Seeking help correcting a large number of 404 errors; 95% traffic halt
-
Hi, the following GWT screenshot tells a bit of the story:
site: http://bit.ly/mrgdD0
http://www.diigo.com/item/image/1dbpl/wrbp
On about Feb 8 I decided to fix a large number of 'duplicate title' warnings being reported in GWT "HTML Suggestions" -- these were for URLs which differed only in parameter case, and which had Canonical tags, but were still reported as dups in GWT.
My traffic had been steady at about 1000 clicks/day.
At midnight on 2/10, Google traffic completely halted, down to 11 clicks/day.
I submitted a reconsideration request and was told 'no manual penalty'.
Also, starting then, the sitemap indexes in GWT showed 'pending' around the clock.
By about the 18th, the 'duplicate titles' count had dropped to about 600 or so... the next day traffic hopped right back to about 800 clicks/day for a week, then stopped again a week later, on the 26th, down to 10/day.
I then noticed that GWT was reporting 20K page-not-found errors -- this has now grown to 35K such errors!
I realized that bogus internal links were being generated because I had failed to disable the PHP warning messages... so I disabled PHP warnings and fixed what I thought was the source of the errors.
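For anyone hitting the same thing: keeping warning output out of rendered pages is typically a php.ini (or ini_set) change. A minimal sketch with standard production values -- the log path is a placeholder:

```ini
; php.ini sketch -- stop PHP warnings from printing into the HTML,
; where they can get mangled into bogus-looking internal links.
; The error_log path is a placeholder; use whatever fits your server.
display_errors = Off
log_errors = On
error_log = /var/log/php_errors.log
```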
However, the not-found count continues to climb -- and I don't know where these bad internal links are coming from, because the GWT report lists these link sources as 'unavailable'.
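One way to find the sources GWT won't show: crawl the site yourself and record which page each bad link appears on. A stdlib-only sketch of the link-extraction piece (the fetch/crawl loop is left out):

```python
# Sketch: extract the internal link targets from a page's HTML so you
# can map (source page -> bad URL) yourself, instead of relying on
# GWT's 'unavailable' source listings.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    # Resolve relative hrefs against the page URL
                    self.links.append(urljoin(self.base_url, value))

def internal_links(html, base_url):
    """Return the links on a page that stay on the same host."""
    collector = LinkCollector(base_url)
    collector.feed(html)
    host = urlparse(base_url).netloc
    return [u for u in collector.links if urlparse(u).netloc == host]
```

Fetch each page of the site, run `internal_links()` on it, and log every (source page, target) pair whose target matches the bogus-URL pattern -- that list is the source report GWT isn't giving you.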
I've been through a similar problem last year and it took four months for Google to digest all the bogus pages and recover. If I have to wait that long again I will lose much $$.
Assuming that the large number of 404 internal errors is the reason for the sudden shutoff...
How can I (a) verify the source of these internal links, given that Google says the source pages are 'unavailable'?
Most critically, how can I do a 'reset' and have Google re-spider my site -- or block the signature of these URLs in order to get rid of these errors ASAP?
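On the 'block the signature' option: if the bogus URLs share a recognizable pattern, a mod_rewrite rule can answer them with 410 Gone, which Google tends to drop faster than a plain 404. A sketch, assuming a hypothetical `Warning` fragment in the query string:

```apache
# .htaccess sketch -- serve 410 Gone for URLs matching the bogus
# signature. "Warning" is a placeholder pattern; substitute the
# actual fingerprint of the machine-generated links on your site.
RewriteEngine On
RewriteCond %{QUERY_STRING} Warning [NC]
RewriteRule ^ - [G,L]
```

Note that a robots.txt Disallow would stop the crawling, but it would also stop Google from ever confirming the URLs are gone, so the errors would linger.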
thanks
-
Hello Rand, I've been facing a similar problem with my site. I'd really appreciate your response here - http://www.seomoz.org/q/help-fixing-the-traffic-drop-that-started-on-4-september-2012.
-
I wouldn't feel too confident that the numbers and dates Google's showing you are precise or accurate. In fact, we've seen times when GWMT is considerably off. I'd watch how Google crawls your site and look at search traffic to your pages - those are likely leading indicators that things are/will be fixed.
-
Thanks for the replies, guys. I had run Xenu on the site and it found no broken links... but the GWT error count continues to climb, and as of today
Google released a MUCH improved timeline view for the error count. Problem is, it's still showing 58K errors as of yesterday and climbing, long after I fixed them -- and it won't show me where it thinks the source is...
These errors are all on internal pages, BTW.
Here's the new Google view:
http://awesomescreenshot.com/0ef1gy6c7
The new GUI also includes a way to mark errors 'fixed' -- one by one!! I need to mark 60 thousand at once!
Also, I can see the date these errors started appearing, and it just doesn't make sense, given that that is the day my traffic started reappearing as well...
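Since GWT lists the sources as 'unavailable', your own access logs may be the better record: the Referer header on each 404 hit usually names the linking page. A stdlib sketch for an Apache/Nginx combined-format log:

```python
# Sketch: tally 404 hits from a combined-format access log, keeping
# both the missing paths and the referring pages that linked to them.
import re
from collections import Counter

LINE_RE = re.compile(
    r'"\S+ (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ "(?P<referer>[^"]*)"'
)

def tally_404s(log_lines):
    """Return (Counter of 404 paths, Counter of referring pages)."""
    paths, referers = Counter(), Counter()
    for line in log_lines:
        m = LINE_RE.search(line)
        if m and m.group("status") == "404":
            paths[m.group("path")] += 1
            # "-" means the client sent no Referer header
            if m.group("referer") not in ("-", ""):
                referers[m.group("referer")] += 1
    return paths, referers
```

The top entries in the referer counter are the pages still emitting the bad links -- internal ones you can fix, external ones you can 301.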
-
I agree with Rand's suggestions. I just ran a Screaming Frog crawl of the whole site (10,233 links, 8,997 URLs) and got no 404s, so I think it's pretty safe to assume you've fixed the 404 issue. Here's the output of the crawl in case you'd like it for reference: http://www.sendspace.com/file/7zui0v
I'd say:
- Definitely clean up and resubmit your XML sitemap
- Double check your backlink profile with Open Site Explorer and MajesticSEO to be sure that there aren't sites linking to URLs that no longer exist. If you find any, make sure to 301 redirect them -- just take all the target URLs and dump them into Screaming Frog in list mode. All the links from OSE point to your homepage, so they're not an issue; I don't have access to Majestic right now, so I couldn't run those for you.
- You can now submit pages in Google Webmaster Tools as well, in the Fetch as Googlebot section. So consider submitting some of the new pages the site generates, in addition to your reconsideration request, to help Google re-crawl and find that the 404s are gone.
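For the sitemap cleanup, one way to guarantee no dead entries is to rebuild the XML from a known-good URL list (e.g. the export of the Screaming Frog crawl above). A minimal stdlib sketch:

```python
# Sketch: build a clean XML sitemap from a list of verified-live URLs,
# so the resubmitted sitemap contains no dead entries.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        # Each URL becomes a <url><loc>...</loc></url> entry
        loc = ET.SubElement(ET.SubElement(urlset, "url"), "loc")
        loc.text = url
    return ET.tostring(urlset, encoding="unicode")
```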
Good luck man and please let us know if nothing changes after you implement these fixes.
-Mike
-
Hi Mark - wow, sounds really rough. I've got a few suggestions:
- First off, you need to make 100% sure that you've actually fixed the issue and that the internal links are pointing to the right places AND any old URLs that may have had internal/external links are either rel=canonicaling or 301 redirecting to the correct, updated locations.
- You might try using a few tools to verify this, including the SEOmoz Crawl Test http://pro.seomoz.org/tools/crawl-test and Screaming Frog: http://www.screamingfrog.co.uk/seo-spider/
- When you are ready, submit new XML Sitemaps to Google with the proper URLs. Make sure you've deleted/removed your old ones.
- You can also send the reconsideration request again, indicating that while you're aware this isn't a penalty, you have realized some technical/navigation issues on the site and believe you've now fixed these.
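The first check above can be scripted: for each old URL, either the response 301s to the right place or the page serves a rel=canonical pointing there. A sketch of just the decision logic, as a pure function -- the fetching and tag parsing are left to whatever crawler you use:

```python
# Sketch: decide whether an old URL is handled correctly, per the
# advice above -- it should either 301 to the expected new URL or
# serve a page whose rel=canonical points at it.

def old_url_ok(status, location, canonical, expected_url):
    """status: HTTP status code of the old URL's response;
    location: Location header value or None;
    canonical: href of the page's rel=canonical tag or None."""
    if status == 301:
        return location == expected_url
    if status == 200:
        return canonical == expected_url
    return False  # 404/410/etc. means the old URL is not handled
```

Run it over an (old URL -> expected URL) map built from your redirect plan; anything returning False still needs fixing before you resubmit sitemaps.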
Hope this helps and wish you the best of luck!