Crawl Errors In Webmaster Tools
-
Hi Guys,
I've searched the web for an answer about the importance of crawl errors in Webmaster Tools, but I keep coming up with different answers.
I have been working on a client's site for the last two months (and just completed one month of link building); however, it seems I have inherited issues I wasn't aware of from the previous guy who did the site.
The site is currently on page 6 for the keyphrase 'boiler spares', with a keyword-rich domain and a good on-page plan. Over the last couple of weeks it has been as high as page 4, only to be pushed back to page 8, and it has now settled at page 6.
The only issue I can find with the site in Webmaster Tools is crawl errors. Here are the stats:
In sitemaps: 123
Not found: 2,079
Restricted by robots.txt: 1
Unreachable: 2
I have read that ecommerce sites can often produce false positives in terms of crawl errors from Google (reported errors that aren't real problems); however, these Not Found crawl errors are being linked from pages within the site.
How have others solved the issue of crawl errors on ecommerce sites? Could this be the reason for the bouncing around in the rankings, or is it just a competitive niche and I need to be patient?
Kind Regards
Neil
-
That's a bit of a pain.
If nothing else, tell them to set up a custom, informative 404 error page that at least directs visitors somewhere else on the site instead of letting them bail.
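On an Apache server with .htaccess enabled, a custom 404 page can be wired up with a single directive; a minimal sketch (the `/404.html` path is a hypothetical example — the page itself should link to popular categories so visitors have somewhere to go):

```apache
# Serve a custom, informative 404 page for missing URLs.
# Keep the path site-relative: a full URL here would make Apache
# issue a redirect instead of returning a real 404 status.
ErrorDocument 404 /404.html
```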
-
Hi Kieran,
Thank you so much for your answer. The issue is I don't have access to the site admin and can only suggest changes.
I'll suggest what you've put above, push ahead with the link building, and see what happens. I've told the client it's a competitive niche, especially this time of year (my boiler always seems to pack in before winter). The site does seem to have settled around page 6, so you may be right.
I just didn't want to rule out the possibility that this was a penalty from Google.
Kind Regards
Neil
-
Not all errors are bad, but if, for instance, you are getting 404 errors and the pages genuinely no longer exist, you should set up 301 redirects in your .htaccess to point Google to the right pages. Doing this will sort out a lot of the errors. But if you are getting 404s and the pages are still there, that is a bigger problem.
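In .htaccess, those 301s can be written per-URL or as a pattern; a sketch, assuming Apache with mod_rewrite available (all paths here are hypothetical examples):

```apache
# Permanently redirect a removed product page to its replacement
Redirect 301 /spares/old-part-123.html /spares/replacement-part

# Or sweep a whole retired section to a live category page
RewriteEngine On
RewriteRule ^discontinued/ /spares/ [R=301,L]
```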
Restricted by robots.txt is not an error; it is telling you that you've asked for somewhere not to be crawled. Check the file to confirm it is restricting the right paths.
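For reference, a deliberate restriction in robots.txt looks like this (the `/checkout/` path is a hypothetical example — anything listed under Disallow will show up as "Restricted by robots.txt" in Webmaster Tools):

```
User-agent: *
Disallow: /checkout/
```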
How often are you pushing your sitemap out to Google? If you are adding pages as actively as your posts suggest, I would think about submitting a new one or automating its creation.
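Automating sitemap creation can be as simple as a short script run on a schedule; a minimal sketch in Python using only the standard library (the URLs and output filename are hypothetical examples):

```python
from datetime import date
from xml.etree import ElementTree as ET

def build_sitemap(urls, outfile="sitemap.xml"):
    """Write a minimal sitemap.xml for the given list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    today = date.today().isoformat()
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
        ET.SubElement(entry, "lastmod").text = today
    ET.ElementTree(urlset).write(outfile, encoding="utf-8", xml_declaration=True)

# In practice the URL list would come from the product database or CMS.
build_sitemap(["https://www.example.com/", "https://www.example.com/boiler-spares"])
```

Resubmitting the fresh file via Webmaster Tools (or referencing it from robots.txt) keeps Google's picture of the site current.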
If you are actively working on the site's content, there can often be this level of SERP bouncing, but if you continue with the content I wouldn't worry too much about the errors for the moment; just do the housekeeping above. There are very few completely spotless sites out there, and Google Webmaster Tools (and even the SEOmoz tools here) will always report some level of errors.
Hope this helps,
Kieran