Why is Google Webmaster Tools reporting a massive increase in 404s?
-
Several weeks back we launched a new website, replacing a legacy system and moving to a new server. The transition broke some of the old URLs, but that didn't seem to be of much concern. We blocked the ones I knew should be blocked in robots.txt, 301 redirected as much duplicate data as possible, used canonical tags wherever I could (which is still an ongoing process), and simply returned 404 for anything that should never really have been there.
For the last few months I've been monitoring the 404s Google reports in Webmaster Tools (WMT), and while we had a few hundred due to the gradual removal of duplicate data, I wasn't too concerned. I've been generating updated sitemaps for Google several times a week with any changed URLs. Then WMT started to report a massive increase in 404s, somewhere around 25,000 per day (making it impossible for me to keep up). The sitemap.xml contains only the new URLs, yet Google still seems to be using the old sitemap from before the launch. The reported sources of the 404s (in WMT) no longer exist; they all come from the old site.
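A quick way to see which legacy URLs vanished between the two sitemaps is to diff their `<loc>` sets. A minimal Python 3 sketch (the old/new sitemap documents are whatever your two files hold; nothing here is specific to WMT):

```python
import xml.etree.ElementTree as ET

# Sitemaps declare this namespace; element lookups must include it.
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the set of <loc> URLs listed in a sitemap document."""
    root = ET.fromstring(xml_text)
    return {loc.text.strip() for loc in root.iterfind(".//sm:loc", NS)}

def dropped_urls(old_xml, new_xml):
    """URLs present before the relaunch but absent now: each one needs
    either a 301 to its replacement or a deliberate 404/410."""
    return sitemap_urls(old_xml) - sitemap_urls(new_xml)
```

Feeding it the pre-launch and current sitemap gives a concrete list of URLs to redirect or knowingly let 404.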
I've attached a screenshot showing the drastic increase in 404s. What could possibly be causing this?
-
Thank you for both responses...
Nakul--
I have been following everything exactly as you have described. In general the goal during the development was to keep changes to an absolute minimum. This has not always been possible.
The majority of external links have been 301 redirected, or, in cases where the new server responds to two different URLs for the same content, a canonical tag has been added.
I have noticed that 99% of the reported URLs are former internal links. The reported 404s are completely out of proportion (194k vs. fewer than 5k pages in the new XML sitemap).
I am really worried. Is there anything else I can do besides monitoring and hoping?
How long does it typically take for things to "work their way out of its system"?
Is it possible that Google is somehow accessing the old IP address (even though the DNS records for the domain have changed)? We left the old server alive and plan to shut it down after the second site has been moved off it.
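One way to check that directly is to request the site from the old IP with the real domain in the Host header, bypassing DNS the way a crawler with a stale cached A record would. A minimal sketch (any IP you pass in is your own; the placeholder below is from a documentation range):

```python
import http.client

def fetch_via_ip(ip, host, path="/", port=80, timeout=5):
    """Request `path` from the server at `ip`, sending the real domain in
    the Host header -- i.e. reach the box directly, without DNS."""
    conn = http.client.HTTPConnection(ip, port, timeout=timeout)
    try:
        conn.request("HEAD", path, headers={"Host": host})
        return conn.getresponse().status
    finally:
        conn.close()

# e.g. fetch_via_ip("203.0.113.10", "www.example.com")  # placeholder old-server IP
```

If the old server still answers 200 for the domain, anything that cached the old record can still crawl the legacy URL space until that box is shut down.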
Thanks,
Adam
-
Agreed; it could be an after-effect stemming from inbound URLs to your site from other sites. That's where the majority of the 404s I see in GWT come from (as opposed to bad pages within my site).
Google probably isn't using the old sitemap if you gave them a new one. What could be happening is that it still needs to "reorganize" and reconcile your old URLs with your new ones. The indexed pages don't just disappear overnight or get replaced immediately because of a sitemap change. Things have to work their way out of its system.
If there are specific URLs you want to try to remedy immediately, look into the GWT Remove URLs option under the Optimization section.
-
What I'd suggest doing is randomly reviewing some of those 404s that appear and checking whether they should indeed be 404s. Are there any bulk rules / wildcard 301s you can implement to redirect that traffic for 3-6 months?
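That spot-check can be scripted. A sketch assuming Python 3 (note urllib follows redirects, so a URL sitting behind a working 301 chain reports the destination's status rather than 301):

```python
import random
import urllib.error
import urllib.request

def status_of(url, timeout=10):
    """Final HTTP status for a URL after any redirects are followed."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # e.g. 404 or 410

def spot_check(reported_urls, sample_size=20):
    """Probe a random sample of the URLs WMT reports as 404."""
    sample = random.sample(reported_urls, min(sample_size, len(reported_urls)))
    return [(status_of(url), url) for url in sample]
```

Anything in the sample that comes back 404 but has a sensible replacement page is a candidate for a redirect rule.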
These URLs are usually found via external links to your website. When you click into the details of any reported 404, it tells you what the error is, whether the link is in the sitemap, and where it is linked from. In most cases you'll find it's linked from somewhere. If it's an internal link, correct it. If it's external, do you think the webmaster would update it if you contacted them, or is it easier to just set up a 301, retaining the SEO value?
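A wildcard/bulk 301 rule is ultimately an ordered list of pattern-to-template mappings, normally expressed in server config (Apache `RewriteRule`, nginx `rewrite`). The same logic in Python is handy for testing a mapping before deploying it; the patterns below are hypothetical examples, not anyone's actual URL scheme:

```python
import re

# Hypothetical legacy-to-new patterns, checked in order.
REDIRECT_RULES = [
    (re.compile(r"^/old-shop/product\.php\?id=(\d+)$"), r"/products/\1"),
    (re.compile(r"^/news/(\d{4})/(.+)\.html$"), r"/blog/\1/\2"),
]

def resolve_redirect(path):
    """Return the 301 target for a legacy path, or None to let it 404."""
    for pattern, template in REDIRECT_RULES:
        if pattern.match(path):
            return pattern.sub(template, path)
    return None
```

Running the list of WMT-reported URLs through a map like this shows at a glance which ones a handful of wildcard rules would cover and which genuinely have no home.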
I hope this helps.
Related Questions
-
Completely redesigned website - set up a new site in Google Webmaster Tools, or keep the existing one?
Hi - our company just completely redesigned our website, going from a static HTML site to a PHP-based site, so every single URL has changed (around 1,500 pages). I put the same verification code into the new site and re-verified, but now Google is listing tons and tons of 404s. Some of them are really old pages that haven't existed in a long time; it would literally be impossible to create redirects for all the 404s it's pulling. Question: when completely changing a site like this, should I have created a whole new Search Console? Or did I do the right thing by using the existing one?
Intermediate & Advanced SEO | | Jenny10 -
Google indexing wrong pages
We have a variety of issues at the moment and need some advice. First off, we have a HUGE indexing issue across our entire website. Website in question: http://www.localsearch.com.au/
Intermediate & Advanced SEO | | localdirectories
Firstly: in Google.com.au, if you search for 'plumbers gosford' (https://www.google.com.au/#q=plumbers+gosford), the wrong page appears - the page that should be ranking here is http://www.localsearch.com.au/Gosford,NSW/Plumbers. I can see this across the board, across multiple locations.
Secondly: Google has recently been reporting URLs such as the following under 'Crawl Errors' in Webmaster Tools:
http://www.localsearch.com.au/Saunders-Beach,QLD/Electronic-Equipment-Sales-Repairs&Sa=U&Ei=xs-XVJzAA9T_YQSMgIHQCw&Ved=0CIMBEBYwEg&Usg=AFQjCNHXPrZZg0JU3O4yTGjWbijon1Q8OA
This is an invalid URL, and more specifically, those query strings appear to be referrer queries from Google itself: &Sa=U&Ei=xs-XVJzAA9T_YQSMgIHQCw&Ved=0CIMBEBYwEg&Usg=AFQjCNHXPrZZg0JU3O4yTGjWbijon1Q8OA
Here's the above example indexed in Google: https://www.google.com.au/#q="AFQjCNHXPrZZg0JU3O4yTGjWbijon1Q8OA"
Does anyone have any advice on those two errors?
Webmaster Tools Internal Links
Hi all, I have around 400 links in the navigation menu (site-wide), and when I use Webmaster Tools to check the internal links to each page, some have as many as 250K and some as few as 200. Shouldn't the number of internal links for pages found in the navigation menu be relatively the same? Or is Google registering more internal links for pages linked closer to the top of the code? Thanks!
Intermediate & Advanced SEO | | Carlos-R0 -
Can anyone explain a sudden drop in impressions? Webmaster Tools screenshot attached.
I posted here recently concerned about a drop in rankings. I've yet to figure out what happened, so I thought I'd post a screenshot of the loss of impressions to see if anyone can help. jeZO1r1.png
Intermediate & Advanced SEO | | SamCUK0 -
Effect of I-Frame on Google Rank
My commercial real estate website (www.nyc-officespace-leader.com) allows visitors to search for office space listings. The site sources listings through a third party, and they are displayed in an iframe. The iframe directs visitors to listing pages such as: http://listings.nyc-officespace-leader.com/getspace.mpl?sp_id=A0173921&cust_id=offspldr At least 10,000 of these pages have backlinks to my site. My question is the following: could these tens of thousands of alphanumeric URLs be detrimental to my site's ranking on Google after the Panda/Penguin updates? Site traffic dropped from 7,000 per month to about 3,300 after the April Google update. Rewriting content for dozens of pages and adding a blog have only somewhat mitigated the negative effects of Panda/Penguin. Could Google be viewing these links from the third-party listing provider as a negative when it viewed them as a plus before? Is there any downside to removing the third-party links, parsing these listings from landlord websites, and displaying them as part of my site with their own URLs, title tags, and description tags? Obviously the new URLs would not be alphanumeric. If these links did not cause the drop in traffic last April, what could be responsible? Thanks in advance for your opinion! Alan
Intermediate & Advanced SEO | | Kingalan10 -
How to determine URL Parameters in Google Webmaster
Hi there! I have a new website with many duplicate meta titles and descriptions because of the expanded features of the e-commerce shopping cart I'm using (mobile website, product sorting, etc.). Aside from canonical tags, is it advisable to use the URL parameters feature in Google Webmaster Tools to disallow crawling of the mobile website and of parameters like "parent", "catalogsetview", "pcsid", "pg", and "mode"? I'd appreciate any advice. 🙂 Thanks!
Intermediate & Advanced SEO | | paumer800 -
Fixing Google Places once Banned
I have a lot of clients who have somehow botched up their Google Places listing, and now are not showing up in local search results. In one particular case, they were using 2 different Gmail accounts and submitted their listing twice by accident. It appears Google has banned them from local search results. How does one undo steps like this and start fresh? Thanks!
Intermediate & Advanced SEO | | ocsearch0 -
HTTP Errors in Webmaster Tools
We recently added a 301 redirect from our non-www domain to the www version. As a result, we now have tons of HTTP errors (403s to be exact) in Webmaster Tools. They're all from over a month ago, but they still show up. How can we fix this?
Intermediate & Advanced SEO | | kylesuss0