When the site's entire URL structure changes, should we update the inbound links pointing to the old URLs?
-
We're changing our website's URL structure, which means all our site URLs will change. After this is done, do we need to update the old inbound external links to point to the new URLs? Yes, the old URLs will be 301 redirected to the new URLs. Many thanks!
-
In a perfect world, if we could control all the links pointing to our site, we would have them link to the new URLs. However, what I was trying to illustrate is that when you change your URLs, there is a method of making sure that the old links, which point to your older URL structure, still lead to your new URL structure via 301 redirects.
Since you are changing URLs that have links pointing at them, e.g. example.com/old must 301 redirect to example.com/new.
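In practice this amounts to a lookup table from old paths to new paths. A minimal sketch in Python (the paths and helper names here are illustrative, not from any actual site):

```python
# Hypothetical old-to-new URL map; these paths are made up for illustration.
OLD_TO_NEW = {
    "/old": "/new",
    "/products/widget.aspx": "/products/widget/",
}

def redirect_target(path):
    """Return the path an old URL should 301 to, or None if it is unmapped."""
    return OLD_TO_NEW.get(path)

def htaccess_rules(mapping):
    """Emit Apache mod_alias 'Redirect 301' lines, one per mapped URL."""
    return "\n".join(f"Redirect 301 {old} {new}"
                     for old, new in sorted(mapping.items()))

print(htaccess_rules(OLD_TO_NEW))
```

The `htaccess_rules` output uses Apache's `Redirect 301 old new` directive format; your server (Nginx, IIS, etc.) would need its own equivalent syntax.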
-
Thank you very much, Tom!
What I also want to know is: for the many external links pointing to the site's older URLs, after the URL change (and 301 redirect), is it necessary to change the links built on other sites to point to the new URLs instead of the old ones? If not, what are the disadvantages? This is also a problem many experience after switching to HTTPS.
-
That's exactly right, as long as the new page is the same page: with a proper 301 you won't lose link equity.
However, if you create new URLs that are not relevant to the old URLs and 301 redirect to them, they can be treated as soft 404 errors, which is dangerous.
I would crawl the site using https://deepcrawl.com or https://www.screamingfrog.co.uk/seo-spider/.
Take the URLs you are going to change, along with their backlinks, and download them to a CSV file.
After that, I would crawl the site with the new URL structure and compare it against the old URL structure.
You can also upload your backlinks and make sure they resolve to the same pages; you can compare the two crawls in DeepCrawl, or upload the URL list into Screaming Frog.
This would allow you to keep your changes in sync.
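The comparison step above can be sketched roughly as follows, assuming you have exported the old crawl, the new crawl, and your redirect map. The CSV column name and function names are assumptions for illustration, not the actual DeepCrawl or Screaming Frog export formats:

```python
import csv

def load_urls(csv_path, column="url"):
    """Read one column of a crawl export CSV into a set of URLs."""
    with open(csv_path, newline="") as f:
        return {row[column] for row in csv.DictReader(f)}

def compare_crawls(old_urls, new_urls, redirects):
    """Return (broken, unchanged): old URLs that are gone and whose redirect
    target is missing from the new crawl, and URLs present in both crawls."""
    broken = {u for u in old_urls
              if u not in new_urls and redirects.get(u) not in new_urls}
    unchanged = old_urls & new_urls
    return broken, unchanged
```

Any URL that lands in `broken` is a candidate soft 404 or lost page and needs a redirect rule before launch.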
See
https://www.deepcrawl.com/knowledge/best-practice/what-you-need-to-prepare-for-a-website-relaunch/
https://www.deepcrawl.com/migrating-a-website-with-deepcrawl/
TEST YOUR NEW XML SITEMAP
Testing a new XML sitemap before you put it live means you can see whether any important URLs are missing from the new version, and whether the new version looks as expected.
(DeepCrawl: keep your legacy URLs in the sitemap and only start highlighting URLs from your new site once they have been indexed. This may take up to three months.)
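One simple way to run that check is to diff the URL sets of the old and new sitemaps. A minimal sketch using the standard sitemaps.org namespace (the example URLs are made up):

```python
import xml.etree.ElementTree as ET

# Tag name for <loc> elements in the standard sitemaps.org namespace.
LOC = "{http://www.sitemaps.org/schemas/sitemap/0.9}loc"

def sitemap_urls(xml_text):
    """Collect every <loc> value from a sitemap XML string."""
    return {el.text.strip() for el in ET.fromstring(xml_text).iter(LOC)}

def missing_from_new(old_xml, new_xml):
    """Old-sitemap URLs that the new sitemap no longer lists."""
    return sitemap_urls(old_xml) - sitemap_urls(new_xml)
```

Every URL this flags should either appear in the new sitemap or have a 301 in your redirect map.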
CRAWL THE NEW SITE WITH MODIFIED URLS
You can replace any absolute links to your live site with URLs from your staging site using the URL Rewrite function. This is useful if you are adding a specific section to your site and want to test how it will perform within the wider context of your site architecture.
https://www.deepcrawl.com/knowledge/best-practice/guide-to-url-design/
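The URL Rewrite idea boils down to swapping the live hostname for the staging hostname before each link is fetched, so the test crawl exercises the staging pages. A hedged sketch (the hostnames are placeholders, and this is the general technique rather than DeepCrawl's internal implementation):

```python
def rewrite_to_staging(url,
                       live="https://www.example.com",
                       staging="https://staging.example.com"):
    """Point an absolute live-site link at its staging equivalent,
    so a test crawl fetches the staging pages instead."""
    if url.startswith(live):
        return staging + url[len(live):]
    return url  # external or already-relative links pass through unchanged
```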
I hope this helps
Tom
-
Thank you very much!
-
Hi Jade.
Best-case scenario, you should update the URLs of the links if you can.
Considering that there will be 301 redirects in place, the loss will not be that much. There's a blog post by Cyrus Shepard discussing the impact of redirects; it states that the PageRank loss is about 15%.
https://moz.com/blog/301-redirection-rules-for-seo
Best of luck.
GR
Related Questions
-
URL change - Sitemap update / redirect
Hi everyone. Recently we performed a massive, hybrid site migration (CMS, URL, site structure change) without losing any traffic (yay!). Today I am finding out that our developers and copywriters decided to change some URLs (the pages are the same) without notifying anyone (I'm not going into details why). Anyhow, some URLs in the sitemap changed, so the old URLs don't exist anymore. Here is an example:
OLD (in sitemap, indexed): https://www.domain.com/destinations/massachusetts/dennis-port
NEW: https://www.domain.com/destinations/massachusetts/cape-cod
Also, you should know that a number of redirects happened in the past (whole site). Last couple of years: HTTP to HTTPS, non-www to www, trailing slash to no trailing slash. Most recent (a month ago): site migration redirects (URL / site structure change). So I could add the new URLs to the sitemap and resubmit in GSC. My dilemma is what to do with the old URLs. We already have a ton of redirects, and adding another one is not something I'm in favor of because of redirect loops and issues that can affect our SEO efforts. I would suggest changing the original, most recent 301 redirects to point to the new URL (pre-migration URLs 301 redirect to the newly created URL). The goal is not to send mixed signals to search engines and not to lose visibility. Any advice? Please let me know if you need more clarification. Thank you.
Intermediate & Advanced SEO
-
Google WMT/Search Console: thousands of "links to your site" even with only one backlink from a website
Hi, I can see in my Search Console that a website is credited with thousands of links to my site, when there is hardly one backlink from one of their pages to ours. Why is this happening? Here is a screenshot: http://imgur.com/a/VleUf
Intermediate & Advanced SEO
-
Weird behavior with site's rankings
I have a problem with my site's rankings. I rank for higher-difficulty (but lower-search-volume) keywords, but my site gets pushed back for lower-difficulty, higher-volume keywords, which literally pisses me off. I thought very seriously about starting fresh with a new domain name, because whatever I do seems not to work. I will admit that in the past (2-3 years ago) I used some of those "SEO packages" I had found, but those links, which numbered no more than 50, are all deleted now, and the domains are disavowed. The only thing I can think of is that somehow my site got flagged as suspicious or something like that in Google. About a month ago, I wrote an article on a topic related to my niche, around a keyword with 41% difficulty. The first page of results has high-authority domains, including a Wikipedia page, and I currently rank in 3rd place. On the other hand, I would expect to rank easily for keywords of 30-35% difficulty, but the exact opposite is happening. The pages I'm trying to rank are not spammy; they check out with Moz tools and also with CanIRank spam filters. All is good and green. Plus, the content of those pages has a content relevancy score varying from 98% to 100%. Your opinion would be very helpful, thank you.
Intermediate & Advanced SEO
-
Chinese Sites Linking With Bizarre Keywords Creating 404's
I just ran a link profile and noticed, for the first time, many spammy Chinese sites linking to my site with spammy keywords such as "Buy Nike" or "Get Viagra". Making matters worse, they're linking to pages that return 404s. Can anybody explain what's going on, and what I can do?
Intermediate & Advanced SEO
-
Brackets vs Encoded URLs: The "Same" in Google's eyes, or dup content?
Hello, this is the first time I've asked a question here, but I would really appreciate the advice of the community - thank you! Scenario: internal linking is pointing to two different versions of a URL, one with brackets [] and the other with the brackets encoded as %5B%5D.
Version 1: http://www.site.com/test?hello[]=all&howdy[]=all&ciao[]=all
Version 2: http://www.site.com/test?hello%5B%5D=all&howdy%5B%5D=all&ciao%5B%5D=all
Question: will search engines view these as duplicate content? Technically there is a difference in characters, but only because one version encodes the brackets and the other does not (see: http://www.w3schools.com/tags/ref_urlencode.asp). We are asking the developer to encode ALL URLs because this seems cleaner, but they are telling us that Google will see zero difference. We aren't sure if this is true, since engines can get hung up on even a single character difference. We don't want to unnecessarily fracture the internal link structure of the site, so again, any feedback is welcome, thank you. 🙂
Intermediate & Advanced SEO
-
New Site Structure and 301s
We're moving to a new site with a new site structure. The old site has numerous backlinks to past events that won't be published on the new site. The new site will also carry about 60 future events that are currently active on the old site. I was wondering about the best way to move forward with the 301 redirect plan. I was considering redirecting the old site structure to an "archive.ourdomain.co.uk" subdomain and redirecting the 60 or so active events to their equivalents on the new site. Would this be a sensible plan? Also, for the active events, is there any difference between redirecting the old page to the archive page and then forwarding to the equivalent new page, versus redirecting the old page directly to the new page?
Intermediate & Advanced SEO
-
Changing Site URLs
I am working with a new client that hasn't implemented any SEO previously. The site has terrible URL nomenclature, and I am wondering if it is worth trying to change it. Will I lose rankings? What is the best URL naming structure? Here's the website: http://www.formica.com/en/home/TradeLanding.aspx. (I am only working on the North America site.) Thanks!
Intermediate & Advanced SEO
-
Migrating a site with new URL structure
I recently redesigned a website that is now in WordPress. It was previously on an odd, custom platform that didn't work very well. The URLs for all the pages are now more search-engine friendly and more concise. The problem is, Google now has all of the old pages and all of the new pages in its index. This is a duplicate content problem, since the content is the same. I have set up a 301 redirect from every old URL to its new counterpart. I was going to do a remove-URL request in Webmaster Tools, but it seems those pages need to return a 404, not a 301, for that to work. Which is better for getting the old URLs out of the index: 404 them and submit a removal request, or 301 them to the new URLs? How long will it take Google to find these 301 redirects and keep just the new pages in the index?
Intermediate & Advanced SEO