600+ 404 Errors: Best Practice for Redirects?
-
Hi All,
I've just checked my GWMT profile for one of my client's sites and found that there are currently over 600 404 Error notifications! This is not that surprising given that we very recently redesigned and launched their new corporate site, which previously had a ton of "junk" legacy pages.
I was wondering whether, in terms of efficient SEO, it would work to simply apply a 301 redirect from the 404 page to our root to solve this issue.
If not, what would be a good solution?
Thanks in advance for all your great advice!
-
Thanks, Takeshi.
As it turned out, most of the 404s would not have been passing very much juice anyway. We did decide to hedge our bets by redirecting the majority of them to the root and marking them as fixed in GWMT.
-
Want to save hours and hours of mindless repetitive coding?
Assuming you are on an Apache/Linux server, you will have a .htaccess file, and this is where you will want to put your redirects.
I noticed one suggestion in the comments to just post a new sitemap and forget them, as Google will too, and this is true. So first you need to answer one question: do any of the 600 have SEO juice?
For those that don't, just forget them. Having 404s does not hurt your SEO, except for the page that actually returns the 404.
If they do have PageRank and SEO value, then the last thing you want to do is let them just die off. You will want, as Mr. Young suggested, to set up a list to prioritize your links. The SEOmoz CSV export is great for this; just filter out the entries you don't need.
So you have your list, and chances are that several will redirect to the same page. Then you need to format the rewrite, which is:
Options +Indexes
Options +FollowSymlinks
RewriteEngine on
RewriteBase /
RewriteRule ^old.htm$ http://www.new.com/ [R=301,L]

with the RewriteRule ^old.htm$ http://www.new.com/ [R=301,L] line being the part you rewrite for each 404 page.
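For example, once your list is prioritized, a handful of those rules might look like the sketch below (the old and new paths are placeholders rather than real URLs; note that several old pages can point at the same new page):

# Sketch only: placeholder paths, several old pages mapped to one new page
RewriteRule ^old-services\.htm$ http://www.new.com/services/ [R=301,L]
RewriteRule ^old-pricing\.htm$ http://www.new.com/services/ [R=301,L]
RewriteRule ^old-contact\.htm$ http://www.new.com/contact/ [R=301,L]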
As an easier, faster, and much better option, you can use the mass rewrite generator at http://seo-website-designer.com/HtAccess-301-Redirect-Generator, which will speed up your efforts a ton. Hope that helps.
-
Generally you should try to 301 redirect to pages that have the same or similar content to what was there before. Try creating a list of all the 404 errors and the pages that are linking to them. Then you can prioritize the list based on the PageRank/Page Authority of the pages linking to those missing URLs.
-
Did your redesign include deleting these pages? If so, just create a new sitemap.xml, submit it to GWMT, and mark the errors as fixed. Google usually stops looking for the missing pages pretty quickly...
-
Related Questions
-
Spike in server errors
Hi, we've recently changed shopping cart platforms. In doing so, a lot of our URLs changed, but I 301'd all of the significant landing pages (as determined by Google Analytics) prior to the switch. However, WMT is now warning me about a spike in server errors for all the pages that no longer exist. It is only crawling them because they used to exist or are linked from pages that used to exist; they no longer actually exist. Is this something I should worry about, or should I let it run its course?
Technical SEO | absoauto0
-
Cannot work out why a bunch of urls are giving a 404 error
I have used the Crawl Diagnostics reports to greatly reduce the number of 404 errors, but there is a batch of 16 URLs that were all published on the same date and have the same referrer URL, and I cannot see the wood for the trees as to what is causing the error. The 404 error links have the structure: http://www.domainname.com/category/thiscategory/page/thiscategory/this-is-a-post The referrer structure is: http://www.domainname.com/category/thiscategory/page/2/ Any suggestions as to how to unravel this would be appreciated.
Technical SEO | Niamh20
-
Crawl errors: 301 (permanent redirect)
Hi, here are some questions about SEO Crawl Diagnostics. We've recently found these 301 (permanent redirect) notices on our website, and we concluded that the two factors below are the causes. 1. Some of our URLs that have no / at the end are automatically redirected to the same URL with a / at the end. 2. For SEO reasons we have designed our website so that when you type in a URL, it automatically redirects to a more SEO-friendly URL. For example, if one of the URLs is www.example.com/b1002/, it will automatically redirect to www.example.com/banana juice/. The question is: are these significant enough for our SEO that they need to be modified? One of the errors in our blog was having too many on-page links. Is this also a significant error, and if so, how many on-page links are recommended from an SEO perspective? Thanks in advance.
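For reference, a trailing-slash rule of the kind that typically produces the first type of 301 looks roughly like this in an Apache .htaccess (a sketch only; the directives actually running on this site are assumptions, not shown in the question):

# Sketch: 301 requests without a trailing slash to the slashed URL, skipping real files
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]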
Technical SEO | Glassworks0
-
Auto-loading content via AJAX - best practices
We have an ecommerce website and I'm looking at replacing the pagination on our category pages with functionality that auto-loads the products as the user scrolls. There are a number of big websites that do this - MyFonts and Kickstarter are two that spring to mind. Obviously, if we are loading the content in via AJAX, then search engine spiders aren't going to be able to crawl our categories in the same way they can now. I'm wondering what the best way to get around this is. Some ideas that spring to mind are: detect the user agent and, if the visitor is a spider, show them the old-style pagination instead of the AJAX version; make sure we submit an updated Google sitemap every day (I'm not sure if this is a reasonable substitute for Google being able to properly crawl our site). Are there any best practices surrounding this approach to pagination? Surely the bigger sites that do this must have had to deal with these issues? Any advice would be much appreciated!
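If the user-agent route is taken, a minimal sketch of that detection at the Apache level might look like the lines below (assuming an Apache front end; the bot list, the /category/ paths, and the nojs=1 flag that the application would read to render classic pagination are all hypothetical):

# Sketch: internally flag crawler requests so the app can serve old-style pagination
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (googlebot|bingbot|slurp) [NC]
RewriteCond %{QUERY_STRING} !(^|&)nojs=1(&|$)
RewriteRule ^category/(.*)$ /category/$1?nojs=1 [L,QSA]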
Technical SEO | paul.younghusband0
-
Industry News Page Best Practices
Hi, We have created an industry news page which automatically curates articles from specific news sources within our sector. Currently, I have the news index page set to be indexed and followed by robots. I have the article pages set to noindex, nofollow, since they are not original content. Is this the best practice, or do you recommend another configuration? Thanks!
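For context, one common way to apply that noindex, nofollow to the curated article pages, assuming they live under their own directory on an Apache server with mod_headers enabled, is an X-Robots-Tag header in that directory's .htaccess (a sketch; a meta robots tag in the page template works just as well):

# Sketch: in the curated-articles directory's .htaccess (requires mod_headers)
Header set X-Robots-Tag "noindex, nofollow"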
Technical SEO | JoshGFialkoff0
-
Domain Redirect Issues
Hi, I have a domain that is 10 years old; this is the old domain that used to be the website for the company. Approximately 7 years ago the company was bought by another and purchased a new domain, which is now 7 years old. The company did not do a 301 redirect, as they were not aware of the SEO implications. They continued building web applications on the old domain while using the new domain for all marketing and for business partner links. They just put in a server-level redirect on the folders themselves to point to the new root. I am on Tomcat and do not have the option of a 301 redirect, as the web applications all use hard-coded (non-relative) links (hundreds of thousands of dollars to recode). After beginning SEO, Google is seeing them as the same domain and has replaced all results with the old domain instead of the new one. My question is: is it better to take the hit and put a robots.txt on the old domain to disallow all robots? Or will that hurt my new domain as well, since Google is seeing them as the same? Or has Google already made the switch without a redirect, treating these as the same, and I should just continue on? (Even the cache for the new site shows the old domain address.) Old domain = www.floridahealthcares.com New domain = www.fhcp.com Update: after writing this I began changing index.htm to use non-relative links, so all links on the old domain's homepage point to fhcp.com, fixing the issue of the entire site being replicated under the old domain. I think this might "patch" my issue, but I would still love to get the opinion of others. Thanks, Shane
Technical SEO | Jinx146780
-
Https redirect
Hi there, a client of mine is asking whether Google would penalize redirecting all of the HTTP URLs to HTTPS (they want to change the security protocol). I assume it is going to work as a classic 301, right? So they might lose some authority along the way, but I am not 100% sure. Can anyone confirm this? Does anyone have a similar experience? Thanks a lot!
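For reference, the redirect itself is usually implemented as a site-wide 301 like this minimal Apache .htaccess sketch (assuming mod_rewrite and that the hostname stays the same; only the protocol changes):

# Sketch: force HTTPS with a 301 for every HTTP request
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]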
Technical SEO | elisainteractive0