Remove more than 1000 crawl errors from GWT in one day?
-
In Google Webmaster Tools you have the feature "Crawl Errors", which displays the top 1000 crawl errors Google has found on your site.
I have around 16k crawl errors at the moment, all of which are fixed. But I can only mark 1000 of them as fixed each day/each time Google crawls the site. (This is because it only displays the top 1000 errors; once I have marked those as fixed, it won't show other errors for a while.)
Does anyone know if it's possible to mark ALL errors as fixed in one operation?
-
Google indexed around 20k useless URLs due to the insane number of URLs MediaWiki generates when you're not using "Short URLs".
It was resolved when we moved the wiki to another location and added the short URLs.
We have just redirected everything (301).
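(For anyone verifying a migration like this, here's a minimal sketch, with a hypothetical wiki URL, that checks whether an old URL answers with a 301 and where it points:)

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so the raw status is visible."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def redirect_status(url):
    """Return (status_code, Location header or None) for a single request."""
    opener = urllib.request.build_opener(NoRedirect())
    try:
        resp = opener.open(url)
        return resp.status, None
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location")

# Hypothetical long MediaWiki URL that should now 301 to its short form:
# status, location = redirect_status(
#     "http://wiki.example.com/index.php?title=Main_Page")
# A healthy migration returns 301 with Location set to the new short URL.
```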
-
So Google indexed more than 16,000 pages on your site, and now what do you do?
Did you just remove them (404) or redirect them (301)?
-
No problem at all. We had a wiki up and running without the "short URLs", so Google had ~19k errors on it because of the too long/complicated URLs. Removed it, problem solved and all errors resolved.
-
Hi Host1
It is not possible! You can only mark 1000 errors as fixed per day.
May I ask how you fixed 16,000 errors at once?
Regards
Alsvik
Related Questions
-
GWT Images Indexing
Hi guys! How long does it normally take for Google to index the images within the sitemap? I recently submitted a new, up-to-date sitemap and most of the pages have been indexed already, but none of the images have. Any reason for that? Cheers
Technical SEO | PremioOscar0
-
Strange 404 Error (Answered)
Hi everyone! I recently took over a new account, and while running an initial crawl on the site a weird 404 error popped up:
http://www.directcolors.com/products/liquid-colored-antique/top
http://www.directcolors.com/applications/concrete-antiquing/top
http://www.directcolors.com/applications/concrete-countertops/top
I understand that the **top** could be referring to an actual link that brings users to the top of a page, but on these pages there is no such link. Am I missing something?
Technical SEO | rblake
-
When Should I Ignore the Error Crawl Report
I have a handful of pages listed in the Error Crawl Report, but the report isn't actually showing anything wrong with these pages. I have double-checked the code on the site and can't find anything either. Should I just move on and ignore the Error Crawl Report for these few pages?
Technical SEO | ChristinaRadisic0
-
Removing a URL from Search Results
I recently renamed a small photography company, and so I transferred the content to the new website, put a 301-redirect on the old website URL, and turned off hosting for that website. But when I search for certain terms that the old URL used to rank highly for (branded terms) the old URL still shows up. The old URL is "www.willmarlowphotography.com" and when you type in "Will Marlow" it often appears in 8th and 9th place on a SERP. So, I have two questions: First, since the URL no longer has a hosting account associated with it, shouldn't it just disappear from SERPs? Second, is there anything else I should have done to make the transition smoother to the new URL? Thanks for any insights you can share.
Technical SEO | williammarlow0
-
X-cart page crawling question.
I have an x-cart site and it is showing only 1 page being crawled. I'm a newbie, is this common? Can it be changed? If so, how? Thanks.
Technical SEO | SteveLMCG0
-
How to avoid 404 errors when taking a page off?
So... we are running a blog that was supposed to have great content. After working on SEO for a while, I discovered there is too much keyword stuffing, plus some spammy WordPress SEO tricks that were supposed to make it rank better. In fact, that worked, but I'm not taking the risk of getting slapped by the Google puppy-panda. So we decided to restart our blog from zero and make a better try. Every page was already ranking in Google. SEOmoz hasn't run its crawl yet, but I'm really sure the crawler would report a lot of 404 errors. My question is: can I avoid these errors with some tool in Google Webmaster Tools' sitemaps, or should I set up some rel=canonicals or 301 redirects? Does Google penalize me for that? It's kind of obvious to me that the answer is YES. Please, help 😉
Technical SEO | ivan.precisodisso0
-
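(If you go the 301 route, one lightweight way to manage it is to keep an old→new URL mapping and render it into Apache redirect rules; the paths below are made-up examples, not from the question:)

```python
# Sketch: generate Apache "Redirect 301" rules from an old->new URL map.
# The paths here are hypothetical examples.
redirect_map = {
    "/old-keyword-stuffed-post/": "/better-post/",
    "/another-old-post/": "/",  # retired pages can point at the homepage or a hub
}

def htaccess_rules(mapping):
    """Render one 'Redirect 301 old new' line per entry, sorted for stable output."""
    return "\n".join(
        f"Redirect 301 {old} {new}" for old, new in sorted(mapping.items())
    )

print(htaccess_rules(redirect_map))
```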
RSS Feed Errors in Google
We recently (2 months ago) launched RSS feeds for the category pages on our site. Last week, error pages started popping up in Webmaster Tools' Crawl Errors report for feeds of old pages that have been deleted from the site, deleted from the sitemap, and out of Google's index since long before we launched the RSS feeds. Example: www.mysite.com/super-old-page/feed/ I checked, and both the URL for the feed and the URL for the actual page are returning 404 statuses. www.mysite.com/super-old-page/ is also showing up in our Crawl Errors. It's been deleted for months, but Webmaster Tools is very slow to remove the page from its Crawl Error report. Where is Google finding these feeds that never existed?
Technical SEO | Hakkasan0
-
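(The status-checking step described above can be batched; a minimal sketch using only the standard library, with the same example URLs from the question. Note that Google treats both 404 and 410 as gone, and 410s tend to drop out of reports faster:)

```python
import urllib.request
import urllib.error

def http_status(url):
    """Fetch url and return the HTTP status code instead of raising on 4xx/5xx."""
    try:
        with urllib.request.urlopen(url) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def is_gone(status):
    """404 and 410 both signal a removed page to crawlers."""
    return status in (404, 410)

# A retired page and its feed should both report as gone:
# for url in ("http://www.mysite.com/super-old-page/",
#             "http://www.mysite.com/super-old-page/feed/"):
#     print(url, is_gone(http_status(url)))
```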
Issue with 'Crawl Errors' in Webmaster Tools
Have an issue with a large number of 'Not Found' webpages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st - 15th). However, clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10. Furthermore, I have checked a large number of the source pages to double-check that the links don't still exist, and as I expected, they don't. Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not. Secondly, if the errors do not actually exist (and never actually have), why do they remain listed in Webmaster Tools, which claims they were found again this month?! Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that using the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors. The info is that if they keep getting 404 returns, they will automatically get removed. Well, I don't know how many times they need to get that 404 in order to get rid of a URL and link that haven't existed for 18-24 months?! Thanks.
Technical SEO | RiceMedia0
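(A small sketch for the double-checking step described above: given a source page's HTML, confirm whether it still contains an anchor pointing at the dead URL. The markup and URL here are made-up examples:)

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Scan HTML for an <a> tag whose href matches a target URL."""
    def __init__(self, target):
        super().__init__()
        self.target = target
        self.found = False

    def handle_starttag(self, tag, attrs):
        if tag == "a" and dict(attrs).get("href") == self.target:
            self.found = True

def page_links_to(source_html, target_url):
    """True if source_html contains a link pointing at target_url."""
    finder = LinkFinder(target_url)
    finder.feed(source_html)
    return finder.found

# Hypothetical example:
# page_links_to('<p><a href="/old-page/">old</a></p>', "/old-page/")  # -> True
```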