Struggling with Google Bot Blocks - Please help!
-
I own a site called www.wheretobuybeauty.com.au
After months and months, we still have a serious issue: Google Webmaster Tools reports that most of our URLs are blocked.
The 404 errors are returning a 200 header code, according to the email below. Do you agree that the 404.php code should be changed? Can you do that, please?
The current state:
Google webmaster tools Index Status shows:
26,000 pages indexed
44,000 pages blocked by robots.
In late March, we implemented a change recommended by an SEO expert: he provided a new robots.txt file and advised that we amend sitemap.xml, among other changes. We implemented those changes and then set up a re-index of the site by Google. The number of blocked URLs eventually dropped to 1,000 for a few days in May and June, but now the problem has rapidly returned.
The number of pages displayed by a Google search on www.google.com.au for the query 'site:wheretobuybeauty.com.au' is 37,000.
This new site has been re-crawled over the last 4 weeks.
About the site
This is a Linux/PHP site with the following:
55,000 URLs in sitemap.xml submitted successfully to webmaster tools
robots.txt file has been modified several times:
Firstly, we had none.
Then we created one, but were advised that it needed to have this current content:
User-agent: *
Disallow:
-
No problem, my friend. You are most welcome. Here at Moz, you will not only be able to get almost all of your SEO-related queries addressed and solved, you will also learn a great deal about digital marketing. I highly recommend that every aspiring digital marketer be active in a community like Moz, and I bet they will save a great deal of time and money as well. I wish you all the very best.
Regards,
Devanur Rafi.
-
Thanks Devanur - trying out everything you have suggested.
-
Hi Alex,
Sorry if I was not clear in my previous post. I meant that, in general, pages with cleaner code will have an edge over similar pages with bad code when it comes to SEO.
Just an example: page A has cleaner code compared to page B, with all other SEO factors being equal. In a scenario like this, page B might not be favored by Google because of issues arising from bad code, like page loading performance, poor rendering in browsers, etc.
The issue at hand might not be because your pages do not pass W3C validation, but it's not a bad idea to have cleaner code on your website.
Best regards,
Devanur Rafi.
-
Hi Devanur
My understanding is that Google does not have a problem with invalid XHTML or pages that are not W3C-compliant. Please see a comment on this at SEOmoz:
-
Hi Alex,
I did a code validation check for the following URL:
It gave 238 errors and 538 warnings!
Search engines like Google favor pages with cleaner code, so I strongly recommend having the code on the website cleaned up.
Here you go for validation check:
Best regards,
Devanur Rafi.
-
Hi Alex,
If the underscores appear in only 4% of the total URLs, then this can safely be set aside for the purposes of the current issue.
The same goes for the keyword repetition in the page titles and URLs. However, if you can revisit your URL structure and have it like the following, you should go for it:
www.wheretobuybeauty.com.au/<brand name>/<product name>, e.g.
http://www.wheretobuybeauty.com.au/floris/royal-arms-diamond-edition-eau-de-parfum-spray-100ml-34oz
The same applies to the page titles.
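Since the site is PHP-based, the cleanup suggested above could be sketched as a small helper. This is a hypothetical illustration only; the function and parameter names are not from the actual site:

```php
<?php
// Hypothetical sketch: normalize a product slug so underscores become
// hyphens and a repeated brand prefix is dropped from the product part.
function normalize_slug($brand, $product)
{
    // Replace underscores with the recommended hyphens.
    $slug = strtolower(str_replace('_', '-', $product));

    // Drop a leading repeat of the brand name,
    // e.g. "floris-royal-arms-..." under /floris/ becomes "royal-arms-...".
    $prefix = strtolower(str_replace('_', '-', $brand)) . '-';
    if (strpos($slug, $prefix) === 0) {
        $slug = substr($slug, strlen($prefix));
    }
    return $slug;
}

echo normalize_slug('floris', 'floris-royal-arms-diamond-edition_34oz');
// royal-arms-diamond-edition-34oz
```

Existing URLs would of course need 301 redirects to any new slugs so that indexed pages do not start returning 404s.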
Now we are left with two things, the page performance and URL canonicalization. Please have them fixed as early as possible.
Also, I checked your IP address, and you are on shared hosting. This is not at all recommended if you are a serious online business owner. Your IP, 103.9.170.75, is shared by at least 250 other domains, including some bad websites.
Though there are different views about bad IP neighborhoods and their impact on SEO, I have always been an advocate of a clean IP and have always recommended it to my clients. You can get a dedicated IP, which is very cheap these days, or better yet a VPS.
For more about this, please check out the "Oops, your IP is either dirty or virtual" section on the following page:
http://www.bruceclay.com/in/seo-tech-tips/techtips.htm
And also, this section, "A Strong Foundation for Your Site to Operate On" on the following page:
http://www.bruceclay.com/blog/2011/04/the-seo-bucket-list-3-things-to-do-before-your-site-dies/
Lastly, I checked your domain's DNS health and here you go for the results:
http://intodns.com/wheretobuybeauty.com.au
Though these might not be causing the current issue, it's good to sort everything out, as we should leave no stone unturned in making our website a better one.
Best regards,
Devanur Rafi.
-
Hey Devanur
Please see our responses below:
Hi Alex,
Thanks for the info. Here are a few issues that I observed with the website, and I am very confident that if you address and fix these, you will come out of this issue with flying colors:
1. URL canonicalization issue: both the www and non-www versions of your website's URLs return an HTTP 200 status code. You should redirect all non-www URLs to their respective www versions via a 301 permanent redirect immediately.
**Response: We are asking the developer to correct this.**
2. Inconsistent URL structure: your website is still using underscores (_) in URLs as word separators, alongside the recommended hyphens (-). This inconsistency can sometimes lead to issues, so please replace all underscores with hyphens.
Response: This problem only occurs in a few pages where special characters have been replaced with underscores – probably in 4% of product pages. I can’t see that this has an impact on the SEO?
3. Google PageSpeed check: when I ran the Google PageSpeed test on some URLs from your website, along with the ones you gave, the score varied between 28 and 60. Please look at the recommendations the PageSpeed tool gives and try to address them (especially ones like "Reduce blocking resources"; for more, see https://developers.google.com/speed/docs/best-practices/rtt#PreferAsyncResources).
I suggest you run the Google PageSpeed check on some of the URLs.
Note: URLs from your website that are present in Google's index may show similar issues when run through the PageSpeed test. This should not stop you from addressing them.
Response: We will ask the developers to improve performance specifically with the highest value things that are showing up in Google PageSpeed check.
4. Heavy pages leading to higher page loading times and response times:
Many of the pages I checked are more than 1.3 MB in size, which is very large. This can be a big problem: it not only hurts your website from the search engines' perspective but also leads to a bad user experience, which ultimately affects your SEO. You can use tools like gtmetrix.com and fix the issues they show.
Response: We will ask the developers to improve performance specifically with the highest value things that are showing up in gtmetrix.com suggestions.
5. Repetition of keywords or phrases in page titles and URLs:
This issue might look like an over-optimization effort and should be fixed as early as possible.
For example: www.wheretobuybeauty.com.au/acqua-di-parma/acqua-di-parma-acqua-di-parma-collezione-barbiere-shaving-cream-75ml_25oz
If you look at the above page, the phrase 'acqua-di-parma' is present twice in both the URL and the page title. This is something you need to review seriously, as it looks like keyword repetition, which is not good from an SEO standpoint.
Response: This occurs with approximately 300 product pages out of 40,000, so a very small percentage. We will clean this up when we update our data. I can't see that this has any impact on SEO considering the small number? Note, however, that every product page is constructed as follows:
http://www.wheretobuybeauty.com.au/floris/floris-royal-arms-diamond-edition-eau-de-parfum-spray-100ml_34oz
Is there some risk that this will look like over optimisation?
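The 301 redirect recommended in point 1 above is commonly implemented with an Apache rewrite rule. The following is a generic sketch, assuming Apache with mod_rewrite enabled in an .htaccess file; it is illustrative, not the site's actual configuration:

```apache
# Hypothetical .htaccess sketch: 301-redirect all non-www requests
# to the www host so only one version of each URL returns 200.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^wheretobuybeauty\.com\.au$ [NC]
RewriteRule ^(.*)$ http://www.wheretobuybeauty.com.au/$1 [R=301,L]
```

After deploying something like this, requesting http://wheretobuybeauty.com.au/ should return a 301 pointing at the www version rather than a 200.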
By the way, your robots.txt file is clean and it should not be causing these issues.
Please have the issues mentioned above fixed as soon as possible, and you should be out of the woods soon after that.
I wish you good luck, Alex.
Best regards,
Devanur Rafi.
-
Hi Alex,
Thanks for the info. Here are a few issues that I observed with the website, and I am very confident that if you address and fix these, you will come out of this issue with flying colors:
1. URL canonicalization issue: both the www and non-www versions of your website's URLs return an HTTP 200 status code. You should redirect all non-www URLs to their respective www versions via a 301 permanent redirect immediately.
2. Inconsistent URL structure: your website is still using underscores (_) in URLs as word separators, alongside the recommended hyphens (-). This inconsistency can sometimes lead to issues, so please replace all underscores with hyphens.
3. Google PageSpeed check: when I ran the Google PageSpeed test on some URLs from your website, along with the ones you gave, the score varied between 28 and 60. Please look at the recommendations the PageSpeed tool gives and try to address them (especially ones like "Reduce blocking resources"; for more, see https://developers.google.com/speed/docs/best-practices/rtt#PreferAsyncResources).
I suggest you run the Google PageSpeed check on some of the URLs.
Note: URLs from your website that are present in Google's index may show similar issues when run through the PageSpeed test. This should not stop you from addressing them.
4. Heavy pages leading to higher page loading times and response times:
Many of the pages I checked are more than 1.3 MB in size, which is very large. This can be a big problem: it not only hurts your website from the search engines' perspective but also leads to a bad user experience, which ultimately affects your SEO. You can use tools like gtmetrix.com and fix the issues they show.
5. Repetition of keywords or phrases in page titles and URLs:
This issue might look like an over-optimization effort and should be fixed as early as possible.
For example: www.wheretobuybeauty.com.au/acqua-di-parma/acqua-di-parma-acqua-di-parma-collezione-barbiere-shaving-cream-75ml_25oz
It could instead have been: www.wheretobuybeauty.com.au/acqua-di-parma/collezione-barbiere-shaving-cream-75ml-25oz
If you look at the above page, the phrase 'acqua-di-parma' is present twice in both the URL and the page title. This is something you need to review seriously, as it looks like keyword repetition, which is not good from an SEO standpoint.
By the way, your robots.txt file is clean and it should not be causing these issues.
Please have the issues mentioned above fixed as soon as possible, and you should be out of the woods soon after that.
I wish you good luck, Alex.
Best regards,
Devanur Rafi.
-
Thanks Devanur
I put this to my partner, and he said he is addressing it, but the main issue still remains.
The critical issue is that only a few pages are visible in Google search, as almost all are blocked from the Google bot. I am re-stating the problem in this email for you.
Can you please take a look at the whole problem and see if you can spot what is causing this?
Is robots.txt causing this? It is the only change we have made, and at one point the problem was corrected, but it has now returned. I have read everything I can about robots.txt on the Google site and in forums.
Here are two examples (out of 44,000) that are blocked. It is easy to find other examples – simply test any of the product pages; only 200 out of 44,000 return any result.
Try searching using www.google.com.au and using the search query
Abercrombie & Fitch 1892 Cobalt Eau De Cologne Spray 50ml/1.7oz site:wheretobuybeauty.com.au
Second example:
Try searching using:
Acqua Di Parma Collezione Barbiere Shaving Cream 75ml/2.5oz site:wheretobuybeauty.com.au
The current state:
Google webmaster tools Index Status shows:
26,000 pages indexed
44,000 pages blocked by robots.
In late March, we implemented a change recommended by an SEO expert, Harmeen: he provided a new robots.txt file and advised that we amend sitemap.xml, among other changes. We implemented those changes and then set up a re-index of the site by Google. The number of blocked URLs eventually dropped to 1,000 for a few days in May and June, but now the problem has rapidly returned.
This new site has been re-crawled over the last 4 weeks.
About the site
55,000 URLs in sitemap.xml submitted successfully to webmaster tools
robots.txt file has been modified several times:
Firstly we had none; then we created one, but were advised that it needed to have this current content:
“User-agent: *
Disallow:
Sitemap: http://www.wheretobuybeauty.com.au/sitemap.xml”
I put this into robots.txt, but was advised yesterday that there should be no blank lines between these lines, so I removed them yesterday.
-
Hi Alex,
Without diving into the increased number of 404 errors reported in your Webmaster Tools account, let us first look at the core issue: 404 pages (non-existing resources) that return an HTTP 200 status code. These are called 'soft 404 errors'. Ideally, every non-existing resource on the website should return an HTTP status code of 404 (or 410, as appropriate), not 200, which is confusing for search engines and bad practice. This should be fixed immediately: please have all such pages return 404, not 200, as soon as possible.
Here is more about soft 404 errors:
https://support.google.com/webmasters/answer/181708?hl=en
and here is more about when to return a 404 status code:
https://support.google.com/webmasters/answer/2409439?hl=en
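As a hedged sketch of the fix (assuming a plain 404.php template on the PHP site described above; the actual template will differ), the key point is to emit the 404 status header before any HTML output:

```php
<?php
// Hypothetical 404.php sketch: send a real 404 status so the error page
// is not a "soft 404". This must run before any output is sent.
http_response_code(404); // PHP >= 5.4; on older versions use: header('HTTP/1.0 404 Not Found');
?>
<html>
<head><title>Page not found</title></head>
<body>
  <h1>Sorry, that page does not exist.</h1>
  <p><a href="http://www.wheretobuybeauty.com.au/">Return to the home page</a></p>
</body>
</html>
```

You can confirm the fix by requesting a made-up URL and checking that the response status line is 404, not 200.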
Best regards,
Devanur Rafi.