How to de-index old URLs after redesigning the website?
-
Thank you for reading.
After redesigning my website (5 months ago), my crawl reports (Moz, Search Console) still show tons of 404 pages, which all seem to be URLs from my previous website (same root domain).
It would be nonsense to 301 redirect them, as there are too many URLs (or would it be nonsense?).
What is the best way to deal with this issue?
-
Thank you Clever PhD, really valuable insights!
-
I completely agree with all of the above - her point reads much like my own situation, where receiving thousands of 404 errors from pages that haven't existed for many months just gets annoying!
-
I respectfully disagree with all of the above. Please repeat after me: 404s are not bad, they are diagnostic. 404s are not bad, they are diagnostic. 404s are not bad, they are diagnostic.
> After redesigning my website (5 months ago), my crawl reports (Moz, Search Console) still show tons of 404 pages, which all seem to be URLs from my previous website (same root domain).
**Part 1: Internal links that 404, from the Moz crawl:** The 404s that show up in the Moz crawl can only come from internal links on your own website; the Moz crawl only follows internal links, not links from other websites. In other words, if you see 404s in your Moz crawl, somewhere you are linking to those pages, and that is why the 404s are showing up. Download the CSV and you will find them in your Moz crawl. Other tools such as Screaming Frog, Botify, and DeepCrawl will show you a similar analysis.
Simple solution: go through your code and remove the internal links on your site that direct the Moz crawler to those pages, and the 404s will go away. (FYI, this same approach works for any internal 301s.) These 404 errors in the Moz report are great diagnostic signals showing where to fix your site. It is bad for users to click a link within your website and get sent to a page that does not exist.
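To make that concrete, here is a minimal sketch of the same internal-link check in Python, assuming the third-party packages requests and beautifulsoup4 are installed (pip install requests beautifulsoup4); the start URL is a hypothetical placeholder. Tools like the Moz crawler or Screaming Frog do the same job at scale:

```python
# Minimal internal 404 finder (a sketch, not a production crawler).
# START_URL is a hypothetical placeholder; replace it with your own site.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"
HOST = urlparse(START_URL).netloc

seen = set()
queue = [(START_URL, "(start)")]  # (url, the page that linked to it)

while queue:
    url, parent = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    try:
        resp = requests.get(url, timeout=10)
    except requests.RequestException:
        continue
    if resp.status_code == 404:
        # "parent" is the page holding the broken internal link to fix.
        print(f"404: {url}  <- linked from {parent}")
        continue
    if "text/html" not in resp.headers.get("Content-Type", ""):
        continue
    # Queue every same-host link found on this page.
    for a in BeautifulSoup(resp.text, "html.parser").find_all("a", href=True):
        link = urljoin(url, a["href"]).split("#")[0]
        if urlparse(link).netloc == HOST:
            queue.append((link, url))
```

Each 404 it prints comes with the page that linked to it, which is exactly the link you need to remove or update.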
**Part 2: External links, from Search Console:** The 404s that show up in Search Console can come from internal links on your own site AND from external links on other sites. Google will keep trying to crawl these URLs because other sites link to pages on your site, and because of your own internal links. For internal link fixing, see the suggestion above. For external links you need a different approach.
Look at the external links: where are they coming from? Are they from quality websites? Do they go to formerly important pages on your website (i.e., pages that were good converters)? If so, use a 301 redirect to send them to the correct replacement page (and this is not always the home page). You get users to the correct page, and any link equity is passed along as well, which can help with your site rankings. If the link goes to a former page on your site that was not any good to start with, and the links coming into it are poor quality, then just let the page 404. Tools such as Moz Open Site Explorer, Ahrefs, or Majestic can help with this assessment - but usually you can just look at a site linking to you and tell whether it is crap or not.
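To illustrate the redirect side, a couple of one-to-one 301 rules on an Apache server might look like this (a sketch for .htaccess using mod_alias; both paths are hypothetical examples, not URLs from this thread):

```
# Hypothetical one-to-one redirects: each old URL goes to its true
# replacement page, not blindly to the home page.
Redirect 301 /old-blog/sizing-guide /guides/sizing
Redirect 301 /old-shop/retro-kits /shop/retro-kits
```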
You need to consider the above regardless of whether you want the 404ing pages in question out of the Google index: even if you get Google to remove a page from the index, it will then see the internal link on your site and find the 404 all over again. Once you have removed the links to the 404 pages on your site, Google will eventually stop crawling them and they will drop out of the index.
An important note regarding the use of robots.txt: blocking Google from crawling the 404s will not remove the pages from the index; Google will just stop crawling them. Google has to be able to crawl the URL, see the 404, recognize that it is a dead page, and then remove the page from the index. Blocking with robots.txt stops Google from doing exactly that. As soon as you take the page out of robots.txt, Google will recrawl it and the 404 shows up again. Robots.txt merely treats a symptom and is a red herring; letting the 404 occur takes care of the issue permanently.
Dead pages are a natural part of the web. Let Google see the 404 (if it truly is a page that should 404 and has no link equity that should be passed along with a 301). Google will crawl the 404 several times, and you will see it in Search Console several times. That is OK; you are not penalized for X number of 404s. You may lose rankings if you 404 a page that Google used to rank well, but that is just because Google will not keep a page highly ranked that does not exist :-).

Help Google out by cleaning up your internal link structure, so that when it sees you no longer link to the page, that is a signal the page should 404. Google knows that, due to the nature of the web, pages will time out on occasion and show an error, so it will continue to recrawl a page just to make sure; it wants to give you the benefit of the doubt. Therefore, you have to give clear directives by not linking to dead pages, so that after Google double- and triple-checks the page, it will finally drop it. You will see the 404 in your Search Console for several months, and then it will eventually go away.
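One practical check while you wait: make sure the dead URLs really answer with a 404 (or 410) status code and not a 200 "soft 404" error page, which Google takes much longer to drop. A quick sketch in Python, using the requests package and hypothetical URLs:

```python
# Confirm old URLs return a real 404/410 rather than a 200 "soft 404".
# The URLs below are hypothetical placeholders.
import requests

old_urls = [
    "https://www.example.com/old-page",
    "https://www.example.com/en-GB/some-old-page",
]

for url in old_urls:
    # allow_redirects=False so a 301/302 reports itself, not its target.
    code = requests.get(url, allow_redirects=False, timeout=10).status_code
    print(code, url)
```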
Hope that makes sense. Good luck!
-
Hey Lana, if you really think that a 301 does not make sense in this case, you can always add the URLs to the robots.txt file, and once Google recrawls your website it will de-index those pages.
Another thing you can do is use the URL removal feature in Google Webmaster Tools. You can do that by going into your GWT and choosing Optimization > Remove URLs, then proceeding accordingly.
Hope this helps!
-
I see the point. Thanks Liam. As most of our 404 pages start with /en-GB/, I will do it like this:
Disallow: /en-GB/
-
Hi Lana,
I've been having the same problem on one of our websites. I've 301 redirected over 5,000 URLs but still receive a lot of 404 errors. One of the main reasons these 404 errors still appear is that other bots, such as Bingbot, are still crawling the old URLs.
To resolve this, I would just block them in your robots.txt file. We blocked our old product URLs, which were under a "product" directory, like this:
User-agent: *
Disallow: /product/
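As an alternative, if the old product URLs map predictably onto their replacements, a single pattern redirect can stand in for thousands of one-to-one rules, and unlike a robots.txt block it works for Bingbot and every other crawler while still passing link equity. A sketch for Apache's .htaccess (the /shop/ target directory is a hypothetical assumption):

```
# Hypothetical pattern redirect: sends every old /product/ URL to its
# /shop/ equivalent with one rule instead of 5,000 separate entries.
RedirectMatch 301 ^/product/(.*)$ /shop/$1
```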