Remove more than 1000 crawl errors from GWT in one day?
-
In Google Webmaster Tools there is a "Crawl Errors" feature, which displays the top 1,000 crawl errors Google has found on your site.
I have around 16k crawl errors at the moment, all of which are already fixed. But I can only mark 1,000 of them as fixed each day (or each time Google crawls the site), since the report only shows the top 1,000 errors; once I have marked those as fixed, it won't show the remaining errors for a while.
Does anyone know if it's possible to mark ALL errors as fixed in one operation?
-
Google indexed around 20k useless URLs because of the insane number of URLs MediaWiki generates when "short URLs" are not configured.
It was resolved when we moved the wiki to another location and enabled the short URLs.
We just redirected everything (301).
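For anyone in the same situation, a blanket 301 like that can be done with a few mod_rewrite lines. This is only a sketch; the /w/ and /wiki/ paths are the conventional MediaWiki locations, not necessarily the ones used here:

```apache
# Send old long-form MediaWiki URLs (/w/index.php?title=Some_Page)
# to the new short-URL form (/wiki/Some_Page) with a permanent redirect.
# The paths below are illustrative; adjust them to the actual install.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^title=(.+)$
RewriteRule ^w/index\.php$ /wiki/%1? [R=301,L]
```

The trailing `?` in the substitution drops the old query string so the `title=` parameter doesn't reappear on the redirect target.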
-
So Google indexed more than 16,000 pages on your site, and then what did you do?
Did you just remove them (404) or redirect them (301)?
-
No problem at all. We had a wiki up and running without the "short URLs", so Google had ~19k errors on it because of the overly long/complicated URLs. We removed it, problem solved, and all the errors were resolved.
-
Hi Host1,
It is not possible! You can only mark 1,000 errors as fixed per day.
May I ask how you fixed 16,000 errors at once?
Regards,
Alsvik
Related Questions
-
On our site, some wrong links were entered by mistake and Google crawled them. We have fixed those links, but they still show up in the Not Found errors. Should we just mark them as fixed, or what is the best way to deal with them?
A parameter was not being sent, so the links were read as null/city and null/country instead of cityname/city.
Technical SEO | Lybrate06060
-
WordPress 404 Errors
Hi Guys,
One of my clients is scratching his head after a site migration. He has moved to WordPress, and now GWT is reporting weird and wonderful strange 404 errors. For example: http://www.allsee-tech.com/digital-signage-blog/category/clients.html
There are loads like the above, which seem to be made up out of his blog and navigation, yet http://www.allsee-tech.com/clients.html works!
Any ideas? Is it a rogue plugin? How do we fix it?
Kind Regards,
Neil
Technical SEO | nezona0
-
What does this error mean?
We recently merged our Google + & Google Local pages and sent a request to Webmaster tools to connect the Google + page to our website. The message was successfully sent. However, when clicking the 'Approve or reject this request' link, the following error message appears: 'Can't find associate request' Anyone know what we are doing incorrectly? Thanks in advance for any insight.
Technical SEO | SEOSponge0
-
Removing URL Parentheses in HTACCESS
I'm reworking a website for a client, and their current URLs contain parentheses. I'd like to get rid of these, but individual 301 redirects in .htaccess are not practical, since parentheses appear in many URLs. Does anyone know an .htaccess rule that will simply strip parentheses from a URL as a 301 redirect?
Technical SEO | JaredMumford0
-
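For what it's worth, a recursive mod_rewrite rule is one common sketch of this. It is hypothetical and untested against this particular site's URL scheme:

```apache
# Strip parentheses from the request path with a 301 redirect.
# mod_rewrite has no global "remove character" operator, so the rule
# recurses: each pass removes one parenthesis, and the browser's
# follow-up request triggers the rule again until none are left.
# A URL with N parentheses therefore costs N redirects.
RewriteEngine On
RewriteRule ^(.*)[()](.*)$ /$1$2 [R=301,L]
```

The redirect chain is the trade-off for avoiding one rule per URL; if the parenthesized URLs follow a single predictable pattern, a more specific rule would be cleaner.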
Drupal Updates = Errors
We have worked diligently to get our SEOmoz crawl diagnostic errors down below 20. However, at least twice now, our coder has applied the Drupal security updates and BINGO, every time, our errors go sky high again. Any thoughts?
Technical SEO | Stevej240
-
If multiple links on a page point to the same URL, and one of them is no-followed, does that impact the one that isn't?
Page A has two links on it that both point to Page B. Link 1 isn't no-follow, but Link 2 is. Will Page A pass any juice to Page B?
Technical SEO | Jay.Neely0
-
Removing Inbound Spam Links
Hello,
Last February, one of my client's websites was delisted. It turns out that some time ago they had attempted to launch a social network along the lines of Ning. The project had fallen apart, but the site was still up. At some point spammers found it and started using it as part of a link farm. Once this was discovered, the subdomain it was hosted on was removed, and the website returned to search within 2 weeks.
Last week, the website disappeared again. OSE shows that in the last 2 months the site has gained about 2,000 additional spam links (there are about 16,000 spam links in total) now pointing at the root domain. On top of that, Google Webmaster Tools is reporting about 15,000 404 errors. I have blocked Google from crawling the path where the spam pages used to be. Is there a way to block the thousands of inbound spam links?
Technical SEO | Simple_Machines0
-
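Blocking crawling of the old path, as described above, is a one-line robots.txt directive. A minimal sketch, assuming the defunct social network lived under a /network/ path (the actual path isn't given in the question):

```text
# Hypothetical robots.txt fragment; /network/ stands in for the
# real path where the spam pages used to live.
User-agent: *
Disallow: /network/
```

Note that robots.txt only stops crawling, not indexing, and it does nothing about the inbound links themselves.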
Page that has no link is being crawled
http://www.povada.com/category/filters/metal:Silver/nstart/1/start/1.htm
I have no idea how the above page was even found by Google, but it seems that it is being crawled, and I'm not sure where it's being found. Can anyone offer a solution?
Technical SEO | 13375auc30