Google Webmaster Tools is saying "Sitemap contains urls which are blocked by robots.txt" after HTTPS move...
-
Hi Everyone,
I really don't see anything wrong with our robots.txt file after our https move that just happened, but Google says all URLs are blocked. The only change I know we need to make is changing the sitemap url to https. Anything you all see wrong with this robots.txt file?
robots.txt
# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
#
# This file will be ignored unless it is at the root of your host:
# Used:    http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
#
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/wc/robots.html
#
# For syntax checking, see:
# http://www.sxw.org.uk/computing/robots/check.html

# Website Sitemap
Sitemap: http://www.bestpricenutrition.com/sitemap.xml

# Crawlers Setup
User-agent: *

# Allowable Index
Allow: /*?p=
Allow: /index.php/blog/
Allow: /catalog/seo_sitemap/category/

# Directories
Disallow: /404/
Disallow: /app/
Disallow: /cgi-bin/
Disallow: /downloader/
Disallow: /includes/
Disallow: /lib/
Disallow: /magento/
Disallow: /pkginfo/
Disallow: /report/
Disallow: /stats/
Disallow: /var/

# Paths (clean URLs)
Disallow: /index.php/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalogsearch/
Disallow: /checkout/
Disallow: /control/
Disallow: /contacts/
Disallow: /customer/
Disallow: /customize/
Disallow: /newsletter/
Disallow: /poll/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /tag/
Disallow: /wishlist/
Disallow: /aitmanufacturers/index/view/
Disallow: /blog/tag/
Disallow: /advancedreviews/abuse/reportajax/
Disallow: /advancedreviews/ajaxproduct/
Disallow: /advancedreviews/proscons/checkbyproscons/
Disallow: /catalog/product/gallery/
Disallow: /productquestions/index/ajaxform/

# Files
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /error_log
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /STATUS.txt

# Paths (no clean URLs)
Disallow: /.php$
Disallow: /?SID=
Disallow: /?cat=
Disallow: /?price=
Disallow: /?flavor=
Disallow: /?dir=
Disallow: /?mode=
Disallow: /?list=
Disallow: /?limit=5
Disallow: /?limit=10
Disallow: /?limit=15
Disallow: /?limit=20
Disallow: /*?limit=25
-
Thanks again for the response. Looks like it just took a little more time for Google to resolve the issue. No more errors. I didn't do anything but resubmit the sitemap and robots.txt.
Thanks for the tips as well. I am going to post one more question in another thread.
-
Jeff,
I was able to find only ONE URL in the sitemap that is blocked by the robots.txt you've posted in this question.
Check the image attached.
The URL is: https://www.bestpricenutrition.com/catalog/product/view/id/15650.html
What did I do? A manual search of all the disallowed terms in the sitemap.
Also, you might want to give this article about robots.txt a comprehensive read. It helped me find that mistake:
The Complete Guide to Robots.txt - Portent.com
Best of luck.
GR. -
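GR's manual search can be automated with a short script. The sketch below (hypothetical, not part of the original thread) feeds a couple of the thread's disallow rules to Python's standard-library robots.txt parser and flags any sitemap URL they block. One caveat, noted in the comments: `urllib.robotparser` implements the original robots.txt convention and does not understand Googlebot-style `*`/`$` wildcards, so it only approximates Google's matching for the plain path-prefix rules.

```python
import urllib.robotparser

# Two plain prefix rules taken from the robots.txt in the question.
# Note: urllib.robotparser does NOT support Googlebot wildcard extensions
# ("*" and "$"), so this only approximates Google for prefix-style rules.
robots_rules = [
    "User-agent: *",
    "Disallow: /catalog/product/view/",
    "Disallow: /checkout/",
]

# In practice these would be parsed out of sitemap.xml; hard-coded here
# for illustration (the second URL is a hypothetical clean product URL).
sitemap_urls = [
    "https://www.bestpricenutrition.com/catalog/product/view/id/15650.html",
    "https://www.bestpricenutrition.com/whey-protein.html",
]

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_rules)

# Collect every sitemap URL the rules disallow for all user agents.
blocked = [u for u in sitemap_urls if not parser.can_fetch("*", u)]
for url in blocked:
    print("blocked by robots.txt:", url)
```

Running this against the full sitemap would surface the one `/catalog/product/view/` URL GR found, since that path prefix is disallowed while clean rewritten product URLs are not.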
Thanks for the quick response.
-
Yes...Google Webmaster Tools is giving examples...and they are basically all the product pages.
-
Yes, I did the Add Site under Google Webmaster Tools...this is from that new 'account'.
-
Yes...we are fixing that.
Do you see anything in that robots.txt above that would indicate we are blocking HTTPS product pages?
-
Hello Jeff,
Just some routine questions to establish a base line:
- Have you checked that the sitemap doesn't include any of the disallowed URLs?
- You said that there was a move to HTTPS; have you created a new account for the new domain?
- I'm seeing that the robots.txt has the old URL for the sitemap, without the HTTPS correction.
Let me know.
Related Questions
-
What is the best tool for getting a SiteMap url for a website with over 4k pages?
I have just migrated my website from Hugo to WordPress and I want to submit the sitemap to Google Search Console (because I haven't done so in a couple of years). It looks like there are many tools for building a sitemap file, but I think they probably vary in quality, especially considering the size of my site.
Technical SEO | | DanKellyCockroach2 -
Clarification regarding robots.txt protocol
Hi,
I have a website with more than 1000 URLs, all of which are already indexed in Google. Now I am going to stop all the available services on my website, and I have removed all the landing pages; only the home page remains. So I need to remove all the indexed URLs from Google. I have already used the robots.txt protocol for removing URLs, but I guess it is not a good method for adding a bulk amount of URLs (nearly 1000) to robots.txt. So I just wanted to know: is there any other method for removing indexed URLs?
Please advise.
Technical SEO | | nlogix0 -
Will a robots.txt disallow apply to a 301ed URL?
Hi there, I have a robots.txt query which I haven't tried before and as we're nearing a big time for sales, I'm hesitant to just roll out to live! Say for example, in my robots.txt I disallow the URL 'example1.html'. In reality, 'example1.html' 301s/302s to 'example2.html'. Would the robots.txt directive also apply to 'example2.html' (disallow) or as it's a separate URL, would the directive be ignored as it's not valid? I have a feeling that as it's a separate URL, the robots disallow directive won't apply. However, just thought I'd sense-check with the community.
Technical SEO | | ecommercebc0 -
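The questioner's hunch here matches how robots.txt is generally understood to work: rules are matched against the URL a crawler is about to request, not against any redirect target. A blocked 'example1.html' is simply never fetched, so the 301 is never seen; 'example2.html' is only blocked if a rule matches its own path. A minimal sketch of that behavior, using the hypothetical paths from the question and Python's standard-library parser:

```python
import urllib.robotparser

# Hypothetical rules mirroring the question: only example1.html is disallowed.
parser = urllib.robotparser.RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /example1.html",
])

# The source of the redirect is disallowed, so a compliant crawler never
# requests it and therefore never discovers the 301 behind it.
assert not parser.can_fetch("*", "https://example.com/example1.html")

# The redirect target is matched against the rules on its own path, which
# no rule covers, so it remains crawlable via any other link that points
# to it directly.
assert parser.can_fetch("*", "https://example.com/example2.html")
print("example2.html remains crawlable")
```

In other words, the disallow directive does not "follow" the redirect; it just prevents the first hop from ever happening.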
What do I need to do for HTTPS switch in Webmaster Tools?
My site is currently verified using a meta tag for both Google and Bing. Will I need to recreate the meta tag or will I be able to use the same one?
Technical SEO | | EcommerceSite1 -
Google Webmaster Tools : no data available
Hi guys, I have a website which is 2 years old. Since 03/01/2013 I have had no data in Google Webmaster Tools > Traffic > Search queries. The queries, the impressions, and the clicks dropped suddenly from one day to the next. I checked the rank of my keywords and the traffic of my site. They are stable and didn't move, which means they aren't causing the problem. Has anybody had the same problem? Is it a Google Webmaster Tools bug? Many thanks.
Technical SEO | | PFX1110 -
URL Error "NODE"
Hey guys, so I crawled my site after fixing a few issues, but for some reason I'm getting this strange node error that goes www.url.com/node/35801, which I haven't seen before. It appears to originate from user-submitted content, and when I go to the page it's a YouTube video with no video playing, just a black, blank screen. Has anyone had this issue before? I think it can probably just be taken off the site, but if it's a programming error of some sort, I'd just like to know what it is to avoid it in the future. Thanks
Technical SEO | | KateGMaker0 -
"/" at the end of a URL
I just noticed that I have the exact same page showing up separately in my Google Analytics reports. One has a "/" at the end and the other does not. Otherwise, these are the exact same URLs. Is this something I need to be aware of from a duplicate content perspective? If so, how do I go about fixing this? I thought the search engines would automatically see that a URL with a "/" at the end is the same as one without, but if that is the case, why is it showing up in my reports as two separate pages?
Technical SEO | | Blockinc0
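On the reporting side of that last question, the two trailing-slash variants can at least be folded together when working with analytics exports. The sketch below (hypothetical report rows, not from the thread) normalizes paths before aggregating pageviews; the real site-side fix is still a sitewide 301 or rel=canonical to a single preferred form, since search engines may treat the two URLs as distinct pages.

```python
def normalize_path(path: str) -> str:
    """Treat the trailing-slash and non-slash versions of a path as one page."""
    # Keep the bare root "/" intact; strip the trailing slash everywhere else.
    return path if path == "/" else path.rstrip("/")

# Hypothetical (path, pageviews) rows as exported from an analytics report.
rows = [("/blog/post", 40), ("/blog/post/", 25), ("/", 10)]

totals: dict[str, int] = {}
for path, views in rows:
    key = normalize_path(path)
    totals[key] = totals.get(key, 0) + views

print(totals)  # the two /blog/post variants are merged into one total
```

This only cleans up the reports; it does not change how the pages are crawled or indexed.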