Why is Google Webmaster Tools reporting a massive increase in 404s?
-
Several weeks back, we launched a new website, replacing a legacy system and moving it to a new server. The transition broke some of the old URLs, but that didn't seem to be of much concern. We blocked the ones I knew should be blocked in robots.txt, 301-redirected as much of the duplicate content as possible, used canonical tags where I could (which is still an ongoing process), and simply returned 404 for anything else that should never really have been there.
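To illustrate, the robots.txt blocking described above might look like this (the paths here are hypothetical examples, not the actual site's):

```
# robots.txt — block legacy sections that should no longer be crawled
User-agent: *
Disallow: /legacy-admin/
Disallow: /print-versions/
```

Note that robots.txt only stops crawling; URLs that should disappear entirely still need a 404/410 or a 301 to their replacement.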
For the last few months, I've been monitoring the 404s Google reports in Webmaster Tools (WMT), and while we had a few hundred due to the gradual removal of duplicate data, I wasn't too concerned. I've been generating updated sitemaps for Google multiple times a week with any updated URLs. Then WMT started to report a massive increase in 404s, somewhere around 25,000 per day (making it impossible for me to keep up). The sitemap.xml contains new URLs only, but it seems that Google still uses the old sitemap from before the launch. The reported sources of the 404s (in WMT) don't exist any longer; they all come from the old site.
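For reference, each entry in the regenerated sitemap.xml is of this shape (the URL and date below are placeholders, not the real site's):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per live page on the new site -->
  <url>
    <loc>https://www.example.com/new-page/</loc>
    <lastmod>2012-09-01</lastmod>
  </url>
</urlset>
```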
I attached a screenshot showing the drastic increase in 404s. What could possibly cause this problem?
-
Thank you for both responses...
Nakul--
I have been following everything exactly as you have described. In general the goal during the development was to keep changes to an absolute minimum. This has not always been possible.
The majority of external links have been 301-redirected, or, in cases where the new server responds to two different URLs with the same content, a canonical tag has been added.
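For example, the canonical tag described above looks like this (example.com stands in for the real domain):

```html
<!-- Placed in the <head> of the duplicate URL, pointing at the preferred version -->
<link rel="canonical" href="https://www.example.com/preferred-page/">
```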
I have noticed that 99% of the reported URLs are former internal links. The reported 404s are completely out of proportion (194k vs. fewer than 5k pages in the new XML sitemap).
I am really worried. Is there anything else I can do besides monitoring and hoping?
How long does it typically take for things to "work their way out of its system"?
Is it possible that Google is somehow accessing the old IP address (although the DNS records for the domain have changed)? We left the old server alive and plan to shut it down after the second site has been moved off it.
Thanks,
Adam
-
Agreed; it could be an aftereffect, stemming from inbound URLs to your site from other sites. That's where the majority of the 404s I see in GWT come from (as opposed to bad pages within my site).
Google probably isn't using the old sitemap if you gave them a new one. What could be happening is that it still needs to "reorganize" and reconcile your old URLs and new URLs. The indexed pages don't just disappear overnight or get replaced immediately because of a sitemap change. Things have to work their way out of its system.
If there are specific URLs you want to try to remedy immediately, look into the GWT Remove URLs option under the Optimization section.
-
What I'd suggest is randomly reviewing some of the 404s that appear and checking whether they should indeed be 404s. Are there any bulk rules / wildcard 301s you can implement to redirect the traffic for 3-6 months?
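As a sketch, a wildcard 301 on an Apache server (mod_alias) might look like the following; the directory names are hypothetical:

```apache
# Map everything under a retired section to its new location,
# preserving the rest of the path ($1)
RedirectMatch 301 ^/old-products/(.*)$ /products/$1

# A retired page with no equivalent can explicitly return 410 (Gone)
Redirect gone /old-specials.html
```

A rule like this handles whole families of legacy URLs at once, which matters when you're seeing thousands of 404s a day rather than a handful you could redirect individually.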
These URLs are usually found via external links to your website. When you click into the detail of any of the reported 404s, it shows you the error details, whether the link is in the sitemap, and where it is linked from. You'll find that in most cases it's linked from somewhere. If it's an internal link, correct it. If it's external, do you think the webmaster might update it if you contact them, or is it easier to just set up a 301 and retain the SEO value?
I hope this helps.
Related Questions
-
JavaScript content not being indexed by Google
I thought Google had gotten better at picking up unique content from JavaScript, but I'm not seeing it with our site. We rate beauty and skincare products using our algorithms. Here is an example of a product: https://www.skinsafeproducts.com/tide-free-gentle-he-liquid-laundry-detergent-100-fl-oz When you look at the cached page (text-only) from Google, none of the core ratings (badges like "fragrance free", "top free", and so forth) are being picked up for ranking. Any idea what we could do to have the ratings incorporated in the indexing?
Intermediate & Advanced SEO | akih
-
Is there a tool to find out if a URL has been deemed "spam" by Google?
I am currently doing a link audit on one of my sites and I am coming across some links that appear to be spam. Is there a tool I can plug a URL into to see if it has been deemed spam by Google?
Intermediate & Advanced SEO | Mozd
-
My indexed pages count is shrinking in Webmaster Tools. Is this normal?
I noticed that our total number of indexed pages dropped recently by a substantial amount (see chart below). Is this normal? http://imgur.com/4GWzkph Also, three weeks after this started dropping, we got a message about an increased number of crawl errors and found that a site update was causing 300+ new 404s. Could this be related?
Intermediate & Advanced SEO | znotes
-
Sitemaps recommended by Google
Google in its guidelines recommends creating a sitemap. Do they mean a /sitemap.xml file, or does the sitemap need to be directly on the website? Does it make any difference? Thank you,
Intermediate & Advanced SEO | seoanalytics
-
Google Webmaster Tools Now Shows Your Most Recent Links
I just saw this story today about a new Google Webmaster Tools feature which lets you download a file of your most recent links: http://searchengineland.com/google-now-shows-you-your-most-recent-links-127903 I downloaded the file today and already discovered a major site issue: our site blog was completely duplicated on a secondary domain we own, and Google was showing that site in the recent links. I've already emailed the dev team to fix this pronto. Is anybody else using this new feature? Perhaps you can share whether it has helped you in any way.
Intermediate & Advanced SEO | | irvingw1 -
How to Block Google Preview?
Hi, Our site works very well for JavaScript-on users; however, many pages are loaded via AJAX and are inaccessible with JS off. I'm looking to make this content available with JS off so search engines can access it, but we don't have the dev time to make those pages 'pretty' for JS-off users. The idea is to make the pages accessible with JS off, but when a user with JS on requests one, forward them to the 'pretty' AJAX version. The content (text, images, links, videos, etc.) is exactly the same, but it's an enormous amount of effort to make the JS-off version 'pretty' and I can't justify the development time to do it.
The problem is that Googlebot will index this page and show a preview of the ugly JS-off page in its results, which isn't good for the brand. Is there a meta tag that can be used to stop the preview but still have the page cached? My current options are to use meta noarchive or "Cache-Control" content="no-cache" to ask Google to stop caching the page completely, but I wanted to know if there was a better way of doing this. Any ideas, guys and girls? Thanks, FashionLux
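For reference, the two options mentioned above would be written in the page's head like this (a sketch of the two techniques named in the question, not a confirmed way to block only the preview):

```html
<!-- Option 1: ask search engines not to store or show a cached copy -->
<meta name="robots" content="noarchive">

<!-- Option 2: ask intermediaries not to cache the page at all -->
<meta http-equiv="Cache-Control" content="no-cache">
```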
Intermediate & Advanced SEO | FashionLux
-
So What On My Site Is Breaking The Google Guidelines?
I have a site that I'm trying to rank for the keyword "jigsaw puzzles". I was originally ranked around #60 or so, and then all of a sudden my site stopped ranking for that keyword (my other keyword rankings stayed). I contacted Google via a site reconsideration request and got the generic response... So I went through and deleted as many links as I could find that I thought Google may not have liked; heck, I even removed links that I don't think I should have, JUST so I could have this fixed. I responded with a list of all the links I removed and also any links that I tried to remove but couldn't, for whatever reason. They are STILL saying my website is breaking the Google guidelines, mainly around links. Can anyone take a peek at my site and see if there's anything on it that may be breaking the guidelines? (Because I can't.) Website in question: http://www.yourjigsawpuzzles.co.uk UPDATE: Just to let everyone know, after multiple reconsideration requests this penalty has been removed. They stated it was a manual penalty. I tried removing numerous different types of links, but they kept saying no, it's still breaking the rules. It wasn't until I removed some website directory links that they removed the manual penalty. Thought it would be interesting for some of you guys.
Intermediate & Advanced SEO | RichardTaylor
-
Is Google mad at me for redirecting...?
Hi, I have an e-commerce website that sells unique, one-of-a-kind items. We have hundreds of items and they sell rapidly. Up till now I have kept the sold items under our "sold items" section, but it has started to get back at me: we now have more "sold" than unsold items, and we are having duplication problems (the items are quite similar apart from sizes, etc.). What should we do? Should we redirect 100 pages each week? Will Google be upset with that (for driving it crazy)? Thanks
Intermediate & Advanced SEO | BeytzNet