Is there a way to see Crawl Errors older than 90 days in Webmaster Tools?
-
I had some big errors show up in November, but I can't see them anymore as the history only goes back 90 days. Is there a way to change the dates in Webmaster Tools? If not, is there another place I'd be able to get this information? We migrated our hosting to a new company around that time, and the agency that handled it for us never downloaded a copy of the redirects that were set up on the old site.
-
What you could also do is run a crawl on your site with Xenu or Screaming Frog to find URLs that return a 404 error; those will probably need a redirect. Beyond that, check the list of internal links within Google Webmaster Tools: if you know what the old structure looked like, you'll be able to set up redirects for any old URLs that still show up.
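That 404 check can also be scripted if you already have a list of old URLs. Here is a minimal Python sketch (the stubbed status fetcher and the example URLs are placeholders for illustration, not anything from the question):

```python
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def find_404s(urls, fetch_status=None):
    """Return the URLs that respond with HTTP 404.

    fetch_status(url) -> int lets you plug in a stub for testing;
    by default it performs a real HTTP request per URL.
    """
    if fetch_status is None:
        def fetch_status(url):
            try:
                return urlopen(url, timeout=10).getcode()
            except HTTPError as e:
                return e.code  # 4xx/5xx responses raise HTTPError
            except URLError:
                return 0  # unreachable host; treat as non-404
    return [u for u in urls if fetch_status(u) == 404]
```

Any URL this returns is a candidate for a 301 redirect to its new location.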
-
I don't think there is a way to change the date range, but if the problem still exists it should be appearing in the issues section. If it isn't, that most likely means the problem has been resolved.
Hope this helps!
Related Questions
-
Why is Google Webmaster Tools showing 404 Page Not Found Errors for web pages that don't have anything to do with my site?
I am currently working on a small site with approx 50 web pages. In the crawl error section in WMT Google has highlighted over 10,000 page not found errors for pages that have nothing to do with my site. Anyone come across this before?
Technical SEO | Pete40 -
Google Webmaster Tools: MESSAGE
Dear site owner or webmaster of http://www.enakliyat.com.tr/,
Technical SEO | iskq
Some of your site's pages may be using techniques that do not comply with Google's Webmaster Guidelines.
In particular, pages on your site appear to offer little unique or original content. Examples of this type of content include thin affiliate pages, doorway pages, and automatically generated or copied content. For more information about creating unique and compelling content, visit http://www.google.com/support/webmasters/bin/answer.py?answer=66361.
We recommend that you make the changes necessary to bring your site in line with Google's quality guidelines. After making these changes, please submit your site for reconsideration in Google's search results.
If you have questions about how to resolve this problem, please see our Webmaster Help Forum for support.
Sincerely,
Google Search Quality Team

After this message we found our low-quality pages and added those URLs to robots.txt. Other than that, what can we do? Our site is a home-to-home moving listing portal. Consumers who want to move their home fill out a form so that moving companies can quote prices. We were generating the listing page URLs from the titles submitted by customers. -
HTML Encoding Error
Okay, so this is driving me nuts because I should know how to find and fix this but for the life of me cannot. One of the sites I work for has a long-standing crawl error in Google WMT for the URL /a%3E that appears on nearly every page of the site. I know that a%3E is an improperly encoded > but I can't seem to find where exactly in the code it's coming from. So I keep putting it off and coming back to it every week or two only to rack my brain and give up on it after about an hour (since it's not a priority and it's not really hurting anything). The site in question is https://www.deckanddockboxes.com/ and some of the pages it can be found on are /small-trash-can.html, /Dock-Step-Storage-Bin.html, and /Standard-Dock-Box-Maxi.html (among others). I figured it was about time to ask for another set of eyes to look at this for me. Any help would be greatly appreciated. Thanks!
Technical SEO | MikeRoberts0 -
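On the encoding question above: %3E is the percent-encoded form of ">", which a couple of lines of Python confirm (a quick sketch, nothing here is site-specific):

```python
from urllib.parse import quote, unquote

# %3E is the percent-encoding of ">", so a crawled URL of /a%3E means
# the crawler found the literal string "/a>" used as a link target,
# typically from a malformed anchor somewhere in the markup where a
# stray ">" ended up inside the href value.
print(unquote("/a%3E"))  # -> /a>
print(quote("/a>"))      # -> /a%3E
```

Grepping the rendered HTML of the affected pages for `a%3E` or a stray `href="/a>` should point at the offending template.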
Strange Webmaster Tools Crawl Report
Up until recently I had robots.txt blocking the indexing of my PDF files, which are all manuals for products we sell. I changed this last week to allow indexing of those files, and now my Webmaster Tools crawl report is listing all my PDFs as not found. What is really strange is that Webmaster Tools is listing an incorrect link structure: "domain.com/file.pdf" instead of "domain.com/manuals/file.pdf". Why is Google indexing these particular pages incorrectly? My robots.txt has nothing else in it besides a disallow for an entirely different folder on my server, and my htaccess is not redirecting anything in regards to my manuals folder either. Even for the outside links present in the crawl report that supposedly link to this 404 file, when I visit those third-party pages they have the correct link structure. Hope someone can help, because right now my not-founds are up in the 500s and that can't be good 🙂 Thanks in advance!
Technical SEO | Virage0 -
Help with Google "not found" errors in Webmaster Tools
Hi, Google Webmaster tools sent me a few messages recently about the jump in the number of 'not found' errors. From 0 to 290 errors, ouch. I know what it's from but I think Google is seeing things. We developed another page/subdomain we're working on with links back to the root domain. Basically a complete list of articles page that lists each article and links back to the root domain. Not sure what Google is crawling but the links that would result in a 'not found' error aren't there. Will these disappear over time? Thanks for the help!
Technical SEO | astahl110 -
Should we block URL param in Webmaster tools after URL migration?
Hi, We have just released a new version of our website that now has human-readable URLs. Our old, ugly URLs are still accessible and cannot be blocked or redirected. These old URLs use a URL parameter with an xpath-like expression language to define the location in our catalog. We have about 2 million pages indexed with this old URL parameter, while we have approximately 70k nice URLs after the migration. This high number of old URLs is due to faceting that was done using this URL parameter. I wonder if we should now completely block this URL parameter in Google Webmaster Tools so that these ugly URLs will be removed from the Google index. Or will this harm our position in Google? Thanks, Chris
Technical SEO | eCommerceSEO0 -
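On the URL-parameter question above: even when the old URLs can't be redirected server-side, a canonical URL can be computed by stripping the parameter. A minimal Python sketch (the parameter name "path" is hypothetical; the question doesn't give the real one):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

def strip_param(url, param="path"):
    """Return url with one query parameter removed, others kept.

    "path" stands in for the xpath-like catalog parameter;
    substitute the site's real parameter name.
    """
    parts = urlsplit(url)
    kept = [(k, v)
            for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if k != param]
    return urlunsplit(parts._replace(query=urlencode(kept)))
```

Emitting the stripped URL as a rel=canonical on the old pages (or configuring the parameter in Webmaster Tools' URL Parameters feature) tells Google which version to keep; blocking alone tends to leave the old pages in the index longer than consolidation would.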
How many days for a Backlink
Hi, One week ago I created a blog on WordPress and added the URL of my blog to Google, Bing, and Yahoo. In that blog I put a link to my webshop (the site I'm working on for SEO), but when I checked the backlinks of my webshop (with SEOmoz tools and Yahoo Site Explorer), the link from the blog still doesn't show. How many days does it take for a backlink to be registered? Thanks
Technical SEO | nipponx0 -
404 Error on Spider Emulators
I recently began working at a company called Uncommon Goods. I ran a few different spider emulators on our homepage (uncommongoods.com) and saw a 404 error on SEO-browser.com as well as URL errors on Summit Media's emulator and SEOmoz's crawler. It seems there is a serious problem here. How is this affecting our site from an SEO standpoint? What are the repercussions? Also, I know we have a lot of JavaScript on our homepage... is this causing the 404? Any advice would be much appreciated. Thanks! -Zack
Technical SEO | znotes0