Crawl Errors In Webmaster Tools
-
Hi Guys,
I've searched the web for an answer on the importance of crawl errors in Webmaster Tools, but keep coming up with different answers.
I have been working on a client's site for the last two months (and just completed one month of link building); however, it seems I have inherited issues I wasn't aware of from the previous person who worked on the site.
The site is currently at page 6 for the keyphrase 'boiler spares', with a keyword-rich domain and a good on-page plan. Over the last couple of weeks it has been as high as page 4, only to be pushed back to page 8, and it has now settled at page 6.
The only issue I can seem to find with the site in Webmaster Tools is crawl errors. Here are the stats:
In sitemaps: 123
Not found: 2,079
Restricted by robots.txt: 1
Unreachable: 2
I have read that ecommerce sites can often give off false negatives in terms of crawl errors from Google, however, these not found crawl errors are being linked from pages within the site.
How have others solved the issue of crawl errors on ecommerce sites? Could this be the reason for the bouncing around in the rankings, or is it just a competitive niche and I need to be patient?
Kind Regards
Neil
-
That's a bit of a pain.
If anything, tell them to set up a custom, informative 404 error page that will at least direct visitors somewhere else rather than letting them bail from the site.
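On an Apache server this is a one-line directive in .htaccess. The path to the error page here is an assumption for illustration; point it at wherever the custom 404 page actually lives:

```apache
# Serve a custom, informative 404 page instead of the bare server default.
# The page should return a real 404 status, not a redirect to the homepage.
ErrorDocument 404 /errors/404.html
```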
-
Hi Kieran,
Thank you so much for your answer. The issue is I don't have access to the site admin and can only suggest changes.
I'll suggest what you have put above, push ahead with the link building, and see what happens. I've told the client it's a competitive niche, especially this time of year (my boiler always seems to pack in before winter). The site does seem to have settled around page 6, so you may be right.
I just didn't want to rule out the possibility of a penalty from Google.
Kind Regards
Neil
-
Not all errors are bad, but, for instance, if you are getting 404 errors and the actual pages no longer exist, then you should set up 301 redirects in your .htaccess to point Google to the right page. Doing this will sort out a lot of the errors. But if you are getting 404s and the pages are still there, that is a bigger problem.
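As a rough sketch, .htaccess 301 rules for retired pages might look like the following. The URLs are made up for illustration; substitute the site's real old and new paths:

```apache
# Permanently redirect a single retired product page to its replacement
Redirect 301 /old-category/old-product.html /new-category/new-product.html

# Or, with mod_rewrite available, redirect a whole retired category in one rule
RewriteEngine On
RewriteRule ^old-category/(.*)$ /new-category/$1 [R=301,L]
```

The key point is the 301 (permanent) status, which tells Google to pass the old URL's value to the new one rather than just report a dead end.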
Restricted by robots.txt is not an error; it simply tells you that you've asked for somewhere not to be crawled. Check the file to see if it is correct in what it is restricting.
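For example, a robots.txt like the one below would produce "Restricted by robots.txt" entries in Webmaster Tools for anything under those paths, and that's intended behaviour, not an error (the paths here are purely illustrative):

```txt
User-agent: *
Disallow: /checkout/
Disallow: /admin/
```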
How often are you pushing your sitemap out to Google? If you are as active with pages as your posts suggest, I would think about submitting a new one or automating its creation.
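Automating sitemap creation can be as simple as a small script run on a schedule. The sketch below builds a minimal sitemap.xml from a list of URLs; the URL list is a hard-coded assumption here, whereas a real ecommerce site would pull live product URLs from its database or CMS:

```python
from datetime import date
from xml.etree import ElementTree as ET


def build_sitemap(urls):
    """Build a minimal sitemap.xml document from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    today = date.today().isoformat()
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = today
    return ET.tostring(urlset, encoding="unicode")


if __name__ == "__main__":
    # Hypothetical page list for illustration only
    pages = [
        "http://www.example.co.uk/",
        "http://www.example.co.uk/boiler-spares/",
    ]
    print(build_sitemap(pages))
```

Writing the output to the site root and resubmitting (or pinging) it in Webmaster Tools keeps Google's picture of the site current as pages come and go.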
If you are actively working on the content of the site, there can often be this level of SERPs bouncing, but if you continue with the content I wouldn't worry too much about the errors for the moment; just do the housekeeping above. There are very few completely spotless sites out there, and Google Webmaster Tools, and even the SEOmoz tools here, will always give you some level of errors.
Hope this helps,
Kieran