Thousands of 404s
-
Hi there,
I'm working on a site that has a ridiculous number of 404s being reported by Webmaster Tools. We believe this was because an on-page error was amending the URLs, appending folders that shouldn't have been there in a growing spiral, e.g. /salons/uk/teeth became something like /salons/uk/teeth/salons/edinburgh/hair/teeth...
Anyway, we think the issue is now sorted, but it seems these pages were indexed, so Google is still requesting them when it crawls the site. What's my best move? It's the sheer volume (over 13,000) that has me concerned, so I thought it best to seek some expert advice before continuing.
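One quick way to confirm that the runaway-folder bug accounts for the bulk of the 13,000 errors is to script a check for repeated path segments across the exported 404 list. A minimal sketch (the sample paths are illustrative, not the asker's real export):

```python
# Hedged sketch: flag 404 paths that repeat a folder name, which is the
# signature of the "spiral" bug described above.

def has_repeated_segment(path):
    """Return True if any folder name appears more than once in the path."""
    segments = [s for s in path.strip("/").split("/") if s]
    return len(segments) != len(set(segments))

# A sample of 404 paths as they might appear in a Webmaster Tools export
paths = [
    "/salons/uk/teeth/salons/edinburgh/hair/teeth",
    "/salons/uk/teeth",
]

spiral_urls = [p for p in paths if has_repeated_segment(p)]
print(spiral_urls)  # prints ['/salons/uk/teeth/salons/edinburgh/hair/teeth']
```

Running this over the full CSV export from Webmaster Tools would tell you what fraction of the 404s match the known bug, and whether anything else is lurking in the remainder.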
Thanks in advance!
-
As it's all sorted now, I really wouldn't worry about them too much. You can use the remove URL functionality in WMT, but this is a manual process, so I wouldn't bother. If I were in your position, I'd probably just let the pages keep 404ing. After a while, Google will usually stop trying to recrawl the 404 pages; right now it is probably recrawling them in case the 404 was an accident.
If it's causing a bandwidth problem, you can solve it with a robots.txt rule.
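A minimal robots.txt rule for this situation might look like the following (the path pattern is illustrative; adjust it to match the actual runaway folder structure):

```text
User-agent: *
# Block the recursively duplicated paths, but not the real /salons/uk/teeth page
Disallow: /salons/uk/teeth/salons/
```

Note that robots.txt only stops crawling; URLs that are already indexed can linger in the index for a while after you add the rule.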
-
Hi Philip!
If these URLs are already indexed, you should 301 redirect them to the correct URLs (especially if any of them happen to have inbound links). You could also try Google's URL removal tool (see https://support.google.com/webmasters/answer/1663416) if all you want is to get rid of them.
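On an Apache server with mod_rewrite enabled, a 301 rule that collapses the duplicated paths back to the canonical URL could be sketched like this (the paths are illustrative, based on the example in the question, not the site's actual structure):

```apache
RewriteEngine On
# Anything appended after the canonical /salons/uk/teeth path
# gets 301-redirected back to the canonical URL
RewriteRule ^salons/uk/teeth/.+$ /salons/uk/teeth [R=301,L]
```

A rule like this preserves any link equity the junk URLs picked up while they were live.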
Good luck, hope this helps.
//Anders
-
Hi Philip,
If the URLs all share the same pattern, I would try adding that pattern to robots.txt to prevent Google from crawling the pages. Even better would be to add a noindex tag to the pages themselves.
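The noindex tag goes in the `<head>` of each page you want dropped from the index:

```html
<meta name="robots" content="noindex">
```

One caveat: if you block the pages in robots.txt at the same time, Google can no longer crawl them to see the noindex tag. Pick one approach, or add the robots.txt block only after the pages have been deindexed.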
Hope this helps!
Related Questions
-
Is there a way to set up automatic 301 redirects from 404s?
Some of our pages under a specific website section get deleted from another data source, and we want to resolve the resulting 404s. Can we set up automated 301 redirects to the section's main page as soon as one of these pages is deleted?
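Assuming the deleted pages all live under one URL prefix (here /catalog/, a made-up example), an Apache mod_rewrite sketch could send requests that no longer resolve anywhere to the section's main page:

```apache
RewriteEngine On
# Only touch URLs under the section, and don't redirect the main page itself
RewriteCond %{REQUEST_URI} ^/catalog/.+
# If the request doesn't map to a real file or directory, treat it as a
# deleted page and 301 it to the section's main page
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule ^ /catalog/ [R=301,L]
```

On a database-driven CMS the "page exists" check usually has to happen in application code instead, since every URL maps to the same front controller rather than to a file on disk.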
Technical SEO | lina_digital2
-
404s still showing in GWT
Hi, my client recently undertook a site migration. Since the new site went live, GWT has highlighted over 2,000 not-found errors. These were fixed nearly two weeks ago, but they're still listed in GWT. Do I have to wait for Google to re-crawl the pages before they're removed from the list, or do I need to go through the list, check each one individually, and mark them as fixed? Any help would be appreciated. Thanks
Technical SEO | ChannelDigital0
-
1,300,000 404s
Just moved a WordPress site over to a new host and re-skinned it. Found out after the fact that the site had been hacked; the database is clean now. I noticed at first that a lot of 404s were being generated, so I set up a script to capture them and return a "410 Gone" response; the plan was then to submit them for removal from the index, thinking there was a manageable number. But when I looked at Google Webmaster Tools, there were over 1,300,000 404 errors (see attachment). My puny attempt to solve this problem seems to need an industrial-size solution. My question is: what would be the best way to deal with this? Not all of the pages are indexed in Google: only 637 are indexed, and you can only see about 150 of them. Bing is another story, reporting over 2,700 pages indexed but showing only about 200. How is this affecting any future rankings? The pages do not rank well, as I found out, because of very slow page-load speed and of course the hacks. The link profile in Google looks OK, and there are no messages in Google Webmaster Tools.
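The capture-and-410 idea can also be expressed directly in Apache configuration rather than a script, assuming the hacked URLs share a recognizable pattern (the path below is a placeholder, not the site's real pattern):

```apache
# Serve "410 Gone" for the hacked URL pattern so crawlers learn the
# pages are permanently removed, not temporarily missing
RedirectMatch gone ^/hacked-path-pattern/
```

A 410 is a stronger signal than a 404, and at 1.3 million URLs a single pattern-based rule is far more practical than handling pages individually.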
Technical SEO | Runner20090
-
Removing thousands of shady backlinks?
Hey guys, we've been hired to redesign a website that has thousands of backlinks created by a (possibly) shady offshore company, and I'm wondering if anyone out there has experience dealing with a clean-up of this size and type. Is it as simple as just disavowing the whole lot? Thanks, Jason
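For reference, Google's disavow file is a plain-text list with one entry per line; a `domain:` prefix disavows every link from that domain, which is usually the practical choice at this scale (the domains below are placeholders):

```text
# Links from these sources were built by a third party; please ignore them
domain:spammy-links-example.com
https://another-example.net/paid-links-page.html
```

The file is uploaded through the disavow links tool in Search Console rather than placed on the site itself.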
Technical SEO | JKorolenko0
-
Google WMT continues reporting fixed 404s - why?
I work with a news site that had a heavy restructuring last spring. This involved removing many pages that were duplicates, tags, etc. Since then, we have taken very careful steps to remove all links pointing to these deleted pages, but for some reason WMT continues to report them. By last August we had cleared over 10k 404s on our site, but this lasted only about two months before they started coming back. The "linked from" field gives no data, and other crawlers like SEOmoz aren't detecting any of these errors. The pages aren't in the sitemap, and I've confirmed that they're not really being linked to from anywhere. Why do these pages keep coming back? Should I even bother removing them over and over again? Thanks -Juanita
Technical SEO | VoxxiVoxxi0
-
Organic Traffic Dropping. 404s found. Is this it?
My organic traffic has dropped 20% in the past month. I noticed some major 404 errors today on a very commonly clicked link on our website. I had the developers fix the error today, but then I thought I'd look in Google Webmaster Tools. That's when I saw the attached picture: 2,454 404 errors were found. We only have 1,500 pages in the entire Google index. Could this massive number of 404 errors account for the drop in organic traffic?
Technical SEO | davewjones0
-
Disallow: /search/ in robots but soft 404s are still showing in GWT and Google search?
Hi guys, I've already added the following rule to robots.txt to prevent search engines from crawling the dynamic pages produced by my website's search feature: Disallow: /search/. But soft 404s are still showing in Google Webmaster Tools and Google search. Do I just need to wait (it's been almost a week since I added the rule)? Thanks, JC
Technical SEO | esiow20130
-
Images on page appear as 404s to Googlebot
When I fetch my website as Googlebot, it returns 404s for all the images on the page, despite the fact that each image is hyperlinked. What could be causing this issue? Thanks!
Technical SEO | Netpace0