Crawl Errors In Webmaster Tools
-
Hi Guys,
I've searched the web for an answer about how important crawl errors in Webmaster Tools really are, but I keep coming up with different answers.
I have been working on a client's site for the last two months (and have just completed one month of link building); however, it seems I have inherited issues I wasn't aware of from the previous person who built the site.
The site is currently at page 6 for the keyphrase 'boiler spares', with a keyword-rich domain and a good on-page plan. Over the last couple of weeks it has been as high as page 4, only to be pushed back to page 8, and it has now settled at page 6.
The only issue I can find with the site in Webmaster Tools is crawl errors. Here are the stats:
In sitemaps: 123
Not found: 2,079
Restricted by robots.txt: 1
Unreachable: 2
I have read that ecommerce sites can often produce false positives in terms of crawl errors in Google; however, these Not Found errors are being linked to from pages within the site.
How have others solved the issue of crawl errors on ecommerce sites? Could this be the reason for the bouncing around in the rankings, or is it just a competitive niche and I need to be patient?
Kind Regards
Neil
-
That's a bit of a pain.
If nothing else, tell them to set up a custom, informative 404 error page that at least directs visitors somewhere useful instead of letting them bail from the site.
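On an Apache server, for example, that might be as simple as one .htaccess directive (the page name here is only a placeholder, and the actual 404 page still has to be built):

# Serve a custom, informative 404 page instead of the server default
ErrorDocument 404 /404.html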
-
Hi Kieran,
Thank you so much for your answer. The issue is I don't have access to the site admin and can only suggest changes.
I'll suggest what you've put above, push ahead with the link building, and see what happens. I've told the client it's a competitive niche, especially at this time of year (my boiler always seems to pack in just before winter). The site does seem to have settled around page 6, so you may be right.
I just didn't want to rule out the possibility that this was a penalty from Google.
Kind Regards
Neil
-
Not all errors are bad, but if, for instance, you are getting 404 errors and the pages genuinely no longer exist, then you should set up 301 redirects in your .htaccess to point Google to the right pages. Doing this will sort out a lot of the errors. But if you are getting 404s and the pages are actually there, then that is a bigger problem.
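As a rough sketch, assuming an Apache server, those .htaccess rules might look something like this (the URLs are placeholders, not the client's actual pages):

# Permanently redirect an old product URL that no longer exists
Redirect 301 /old-product.html /new-product.html
# Or, with mod_rewrite, map a whole removed category to its replacement
RewriteEngine On
RewriteRule ^discontinued-range/(.*)$ /boiler-spares/$1 [R=301,L]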
Restricted by robots.txt is not an error; it is simply telling you that you have asked for something not to be crawled. Check the file to make sure it is restricting the right areas.
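For reference, a typical ecommerce robots.txt deliberately blocks things like cart and internal search URLs, along these lines (the paths are illustrative only):

User-agent: *
# Keep crawlers out of cart, checkout and internal search results
Disallow: /cart/
Disallow: /checkout/
Disallow: /search?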
How often are you pushing your sitemap out to Google? If you are adding pages as actively as your posts suggest, I would think about submitting a new one or automating its creation.
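A minimal sitemap entry looks like the snippet below; most ecommerce platforms can regenerate the file automatically, and it can then be resubmitted in Webmaster Tools (the URL and date are placeholders):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- one <url> block per indexable page -->
    <loc>http://www.example.com/boiler-spares/some-product/</loc>
    <lastmod>2012-11-01</lastmod>
  </url>
</urlset>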
If you are actively working on the content of the site, there can often be this level of bouncing around in the SERPs, but if you continue with the content I wouldn't worry too much about the errors for the moment; just do the housekeeping above. There are very few completely spotless sites out there, and Google Webmaster Tools, and even the SEOmoz tools here, will always report some level of errors.
Hope this helps,
Kieran
Related Questions
-
www vs non-www - Crawl Error 902
I have just taken over admin of my company website, and I have been confronted with crawl error 902 on the existing campaign that has been running for years in Moz. This seems like an intermittent problem. I have searched and tried many of the suggested solutions, and none of them seem to help. The campaign is currently set up with the URL http://companywebsite.co.uk; when I tried to do a manual Moz crawl using this URL I got an error message. I changed the URL to crawl to http://www.companywebsite.co.uk and the crawl went off without a hitch; I'm currently waiting on the results. From testing I now know that if I go to the non-www version of my company's website, nothing happens and it never loads, but if I go to the www version, it loads right away. I know for SEO you only want one of these URLs so you don't have duplicate content, but I thought the non-www version should redirect to the www version, not just be completely missing. I tried to set up a new campaign with the default URL being the www version, but Moz automatically changed it to the non-www version. It seems I cannot set up a new campaign with it automatically crawling the www version. Does it sound like I'm on the right path to finding the cause? Or can somebody else offer a solution? Many thanks, Ben
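Assuming an Apache server, forcing the www version (once the non-www hostname actually resolves in DNS) would typically be a rewrite along these lines, using the example domain from the question:

# Send any non-www request to the www hostname with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^companywebsite\.co\.uk$ [NC]
RewriteRule ^(.*)$ http://www.companywebsite.co.uk/$1 [R=301,L]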
Technical SEO | ATP
-
Is there a way to see Crawl Errors older than 90 days in Webmaster Tools?
I had some big errors show up in November, but I can't see them anymore as the history only goes back 90 days. Is there a way to change the dates in Webmaster Tools? If not, is there another place I'd be able to get this information? We migrated our hosting to a new company around this time, and the agency that handled it for us never downloaded a copy of all the redirects that were set up on the old site.
Technical SEO | b4cab
-
Why are my 301 redirects and duplicate pages (with canonicals) still showing up as duplicates in Webmaster Tools?
My guess is that in time Google will realize that my duplicate content is not actually duplicate content, but in the meantime I'd like to get your feedback. The reporting in Webmaster Tools looks something like this:
Duplicates: /url1.html, /url2.html, /url3.html, /category/product/url.html, /category2/product/url.html
url3.html is the true canonical page in the list above. url1.html and url2.html are old URLs that 301 to url3.html, so it seems my bases are covered there. /category/product/url.html and /category2/product/url.html do not redirect; they are the same page as url3.html. Each of the category URLs has a canonical URL of url3.html in the header, so it seems my bases are covered there as well. Can I expect Google to pick up on this? Why wouldn't it understand this already?
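Spelled out, the canonical setup described there amounts to a link element like this in the head of each category URL (the domain is a placeholder; the path comes from the example above):

<!-- on /category/product/url.html and /category2/product/url.html -->
<link rel="canonical" href="http://www.example.com/url3.html" />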
Technical SEO | bearpaw
-
To avoid errors in our Moz crawl, we removed subdomains from our host. (First we tried 301 redirects, also listed as errors.) Now we have backlinks all over the web that are broken. How bad is this, from a pagerank standpoint?
Our Moz crawl kept telling us we had duplicate page content even though our subdomains were redirected to our main site. (Pages from Wineracks.vigilantinc.com were 301 redirected to vigilantinc.com/wineracks.) Now, to solve that problem, we have removed the wineracks.vigilantinc.com subdomain. The error report is better, but now we have broken backlinks, thousands of them. Is this hurting us worse than the duplicate content problem?
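For context, the subdomain-to-subfolder 301 redirect they first tried would typically be an Apache rewrite roughly like this (hostname and paths are taken from the question and are illustrative only):

# Send every request on the old subdomain to the matching subfolder URL
RewriteEngine On
RewriteCond %{HTTP_HOST} ^wineracks\.vigilantinc\.com$ [NC]
RewriteRule ^(.*)$ http://vigilantinc.com/wineracks/$1 [R=301,L]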
Technical SEO | KristyFord
-
My website pages are not being crawled - what should I do?
Hi all. I have made some changes on the website, so I would like them to be crawled by the search engines, Google especially. I made these changes around two weeks ago. I have submitted my website to good bookmarking websites. I have also used the 'Fetch as Google' tool available in Google Webmaster Tools and resubmitted the sitemap.xml. My pages are still not crawled. Your opinions please? Thanks
Technical SEO | lucidsoftech
-
Increase in authorization permission errors after site switch
We launched our new site two days ago. Because the site was down for 12 hours for maintenance, Google Webmaster Tools is showing this error. Google hasn't crawled since then; it's been 36 hours. Do we need to do anything? We have close to a million pages that Google crawled before, and I am wondering whether this will affect anything.
Technical SEO | tpt.com
-
Keyword Difficulty Tool
Hi Mozzers! Rand Fishkin posted a very helpful and important post yesterday about keyword difficulty. I would be happy if you could share here the metrics from your keyword difficulty reports, to understand more about our website's position in the SERPs and to know what to work on if someone with the same keyword difficulty metrics is ranking higher than me. It would be very nice if we could talk here about keyword difficulty how-tos. Thanks
Technical SEO | leadsprofi
-
Issue with 'Crawl Errors' in Webmaster Tools
Have an issue with a large number of 'Not Found' pages being listed in Webmaster Tools. In the 'Detected' column, the dates are recent (May 1st - 15th). However, clicking into the 'Linked From' column, all of the link sources are old, many from 2009-10. Furthermore, I have checked a large number of the source pages to double-check that the links don't still exist, and they don't, as I expected. Firstly, I am concerned that Google thinks there is a vast number of broken links on this site when in fact there is not. Secondly, if the errors do not actually exist (and never actually have), why do they remain listed in Webmaster Tools, which claims they were found again this month? Thirdly, what's the best and quickest way of getting rid of these errors? Google advises that using the 'URL Removal Tool' will only remove the pages from the Google index, NOT from the crawl errors. The guidance is that if the URLs keep returning 404s, they will automatically be removed. Well, I don't know how many times they need to get that 404 in order to get rid of a URL and link that haven't existed for 18-24 months! Thanks.
Technical SEO | RiceMedia