How to Destroy Old 404 Pages
-
Hello Mozzers,
So I just purchased a new domain, and to my surprise it has a domain authority of 13 right out of the box (what luck!). I needed to investigate. To make a long story short, the domain used to be home to a music blog with hundreds of pages, all of which are missing now. I have about 400 pages on my hands that are returning a 404. What is the best method for eliminating these pages?
Does deleting the Crawl Errors in Google Webmaster Tools do anything?
Thanks
-
What a thorough response! I'm in the Option B scenario. The old content has nothing to do with my site so I don't need to redirect the old URLs. I will just wait out Google crawling those 404s.
Thanks!
-
You have a few options here. Option A applies if you are going to build a site with similar topic-based content to the old one, and you want to carry over a larger portion of the old site's domain authority to the new one.
-
Pull those 404 errors from GWT into a spreadsheet. This gives you a corpus of URLs to work with.
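Once you have that export, filtering it down to just the dead URLs is a one-liner. A minimal sketch in Python; the column names here ("URL", "Response Code") are assumptions, since the exact headers in the GWT export vary, so adjust them to match your actual CSV:

```python
import csv
import io

# Stand-in for the crawl-errors CSV exported from GWT.
# Column names are hypothetical; check your own export's header row.
sample_export = io.StringIO(
    "URL,Response Code,Detected\n"
    "http://example.com/old-review-1,404,2013-01-02\n"
    "http://example.com/old-review-2,404,2013-01-05\n"
)

# Keep only the rows that actually returned a 404.
dead_urls = [row["URL"] for row in csv.DictReader(sample_export)
             if row["Response Code"] == "404"]
print(dead_urls)
# → ['http://example.com/old-review-1', 'http://example.com/old-review-2']
```

The resulting list is what you would feed into Bing WT, OSE, and the Wayback Machine in the following steps.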
-
Go into Bing WT, which has a way to browse what it has (and had) indexed. What is nice here is that Bing will tell you which URLs (even old 404s) have links pointing to them.
-
Run your URLs through Open Site Explorer. You then get linking data, plus Facebook and Twitter data, in addition to OSE metrics on the old URLs.
-
If need be, run the more important dead URLs through the Wayback Machine (http://archive.org/web/web.php); you can even see what the actual content was on the old URLs.
-
After doing all of this, you should quickly be able to see whether there were any authority pages on the site that have now expired, and you will also know what those pages were about via the Wayback Machine.
-
For the authority pages, create new pages on the new site that deal with the same topic, i.e. are semantically related to the old pages.
-
301 the old authority pages to the new authority pages.
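On an Apache server, those 301s can live in .htaccess via mod_alias. A minimal sketch; the paths below are hypothetical examples standing in for your own old and new URLs:

```apache
# .htaccess — map each old authority page to its new, semantically
# related replacement. Paths are made-up examples.
Redirect 301 /old-music-reviews/best-albums /music/album-guides
Redirect 301 /old-interviews/some-band /music/artist-interviews
```

One line per page whose authority you want to keep; every old URL without a rule simply falls through to a 404, which is exactly what the next step relies on.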
-
The rest of the URLs you can just let 404. They will continue to 404 several times until Google drops them. I would leave them in GWT; over time they should drop out as Google starts to ignore those pages, though this may take a few months. You can then just check GWT for any new 404s that might show up from the new site and need to be dealt with.
One thing to note on all of this: you may have to let the old sitemap 404 rather than redirecting it to your current one.
http://moz.com/blog/how-to-fix-crawl-errors-in-google-webmaster-tools
"One frustrating thing that Google does is it will continually crawl old sitemaps that you have since deleted to check that the sitemap and URLs are in fact dead. If you have an old sitemap that you have removed from Webmaster Tools, and you don’t want being crawled, make sure you let that sitemap 404 and that you are not redirecting the sitemap to your current sitemap."
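In Apache terms, that means making sure no leftover rule redirects the retired sitemap. A hedged sketch of what to check for (the filenames are examples):

```apache
# Do NOT do this — redirecting a deleted sitemap keeps Google
# re-crawling it indefinitely:
#   Redirect 301 /old-sitemap.xml /sitemap.xml

# Instead: delete old-sitemap.xml from the server and add no rule
# at all, so requests for it fall through to the normal 404 response.
```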
If you delete the 404s from GWT, they will just show up again the next time Google spiders the old pages, so that is up to you.
Option B: if you don't care about the old pages, just let them 404 as mentioned above, but be aware of the issue with old sitemaps. You can check the Google index for old URLs in the SERPs, or look in GWT at your Search Traffic data and make sure the old URLs are not showing up under your Search Queries.
-
Related Questions
-
Splitting down pages
Intermediate & Advanced SEO | SamBayPublishing
Hello everyone, I have a page on my directory, for example: https://ose.directory/topics/breathing-apparatus. The title on this page is short yet a bit unspecific: "Breathing Apparatus Companies, Suppliers and Manufacturers". In Webmaster Tools these terms hold different values for each category, so "topic name companies" sometimes has a lot more searches than "topic name suppliers". I was thinking: if I split the page into the following three separate pages, would that be better?
https://ose.directory/topics/breathing-apparatus (main - Title: Breathing Apparatus)
https://ose.directory/topics/breathing-apparatus/companies (Title: Breathing Apparatus Companies)
https://ose.directory/topics/breathing-apparatus/manufacturers (Title: Breathing Apparatus Manufacturers)
https://ose.directory/topics/breathing-apparatus/suppliers (Title: Breathing Apparatus Suppliers)
Two questions: Would this be more beneficial from an SEO perspective? Would Google penalise me for doing this, and if so, is there a way to do it properly? PS. The list of companies may be the same, but the page content would be ever so slightly different. I know this would not affect my users much because the terms I am using all mean pretty much the same thing. The companies do all three.
-
Substantial difference between number of indexed pages and sitemap pages
Intermediate & Advanced SEO | Online-Marketing-Guy
Hey there, I am doing a website audit at the moment. I've noticed substantial differences between the number of pages indexed (Search Console), the number of pages in the sitemap, and the number I get when I crawl the site with Screaming Frog (see below). Would those discrepancies concern you? The website and its rankings seem fine otherwise.
Total indexed: 2,360 (Search Console)
About 2,920 results (Google search "site:example.com")
Sitemap: 1,229 URLs
Screaming Frog spider: 1,352 URLs
Cheers, Jochen
-
After adding an SSL certificate to my site I encountered problems with duplicate pages and page titles
Intermediate & Advanced SEO | LovingatYourBest
Hey everyone! After adding an SSL certificate to my site, it seems that every page on my site has duplicated itself. I think that is because it has combined www.domainname.com and domainname.com. I would really hate to add a rel canonical to every page to solve this issue. I am sure there is another way, but I am not sure how to do it. Has anyone else run into this problem, and if so, how did you solve it? Thanks; any and all ideas are very much appreciated.
-
Is it a problem to use a 301 redirect to a 404 error page, instead of serving a 404 page directly?
Intermediate & Advanced SEO | lcourse
We are building URLs dynamically with Apache rewrite. When we detect that a URL matches some valid patterns, we serve a script, which then may detect that the combination of parameters in the URL does not exist. If this happens, we produce a 301 redirect to another URL which serves a 404 error page. So my doubt is the following: do I have to worry about not serving a 404 directly, but redirecting (301) to a 404 page? Will this lead to the erroneous original URL staying longer in the Google index than if I served a 404 directly? Some context: it is a site with about 200,000 web pages, and we currently have 90,000 404 errors reported in Webmaster Tools (even though only 600 were detected last month).
-
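The 301-then-404 hop described in this question can be collapsed by having the dispatch script answer 404 on the requested URL itself. A minimal sketch in Python; the question doesn't say what language the script is written in, and the function and parameter names here are invented for illustration:

```python
def handle_request(params, known_combinations):
    """Return (status, body) for a dynamically built URL.

    Instead of 301-redirecting unknown parameter combinations to a
    separate error URL, emit the 404 status on the original URL so
    there is only one URL for the crawler to evaluate and drop.
    """
    if params not in known_combinations:
        # Direct 404: no redirect hop, no second error URL to crawl.
        return 404, "Page not found"
    return 200, "Rendered page for %s" % (params,)

known = {("category", "valid-id")}
print(handle_request(("category", "bad-id"), known))
# → (404, 'Page not found')
```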
301 redirect or link back from old to new pages
Intermediate & Advanced SEO | gigantictickets
Hi all, We run a ticket agent and have multiple events that occur year after year, for example a festival. The festival has a main page, with each year's event having a different page, like the below:
Main page:
http://www.gigantic.com/leefest-tickets
Event pages:
http://www.gigantic.com/leefest-2010-tickets/hawksbrook-lane-beckenham/2009-08-15-13-00-gce/11246a
http://www.gigantic.com/leefest-2010-tickets/highhams-hill-farm-warlingham/2010-08-14-13-00-gce/19044a
http://www.gigantic.com/leefest-2011-tickets/highhams-hill-farm-warlingham/2011-08-13-13-00-gce/26204a
http://www.gigantic.com/leefest-2012-tickets/highhams-hill-farm-warlingham/2012-06-29-12-00-gce/32168a
http://www.gigantic.com/leefest-2013/highhams-hill-farm/2013-07-12-12-00
My question is: is it better to leave the old event pages active and link them back to the main page, or to 301 redirect these pages once they're out of date (leaving each one there until there is a new event page to replace it)? If the best answer is to leave the pages there, should I use a canonical tag back to the main page, and what would be the best way to link back? There is a breadcrumb there now, but it doesn't seem too obvious for users to click. The keyword we're aiming for in this example is 'Leefest Tickets', which has good ranking now; the main page and the 2012 page are listed. Thanks in advance for your help.
-
How do I best deal with pages returning 404 errors, as they contain links from other sites?
Intermediate & Advanced SEO | Towelsrus
I have over 750 URLs returning 404 errors. The majority of these pages have backlinks from other sites; however, the credibility of those linking pages, from what I can see, is somewhat dubious: mainly forums and sites with low DA and PA. It has been suggested placing 301 redirects on these pages, a nice easy solution, but I am concerned that we could do more harm than good to our site's credibility and link-building strategy going into 2013. I don't want to redirect these pages if it's going to cause a Panda/Penguin problem. Could I request manual removal or something of that nature? Thoughts appreciated.
-
Internal page ranking
Intermediate & Advanced SEO | JML1179
If I have a domain, Example.com, I want it to rank for several keywords. Say I build a page on the domain called Example.com/glasses. If I SEO the page Example.com/glasses with backlinks etc., will that URL come up on Google, or will it simply bring Example.com up in the SERPs? If I have three keywords, should I make a subpage for each and SEO that page? Will that make the domain rank for all three keywords?
-
Pages with Little Content
Intermediate & Advanced SEO | andywozhere
I have a website that lists events in Dublin, Ireland. I want to provide a comprehensive number of listings, but there are not enough hours in the day to write a detailed (or even short) unique description for every event. At the moment I have some pages with little detail other than the event title and venue. Should I try to prevent Google from crawling/indexing these pages for fear of reducing the overall ranking of the site? At the moment I only link to these pages via the RSS feed. I could remove the pages entirely from my feed, but then that means I remove information that might be useful to people following the events feed. Here is an example page with very little content.