Can I block HTTPS URLs using the Host directive in robots.txt?
-
Hello Moz Community,
Recently, I found that Googlebot has started crawling the HTTPS URLs of my website, which is increasing the number of duplicate pages on our site.
Instead of creating a separate robots.txt file for the HTTPS version of my website, can I use the Host directive in robots.txt to suggest to Googlebot which version of the website is the original?
Host: http://www.example.com
I was wondering whether this method will work and signal to Googlebot that the HTTPS URLs are a mirror of this website.
Thanks for all of the great responses!
Regards,
Ramendra -
Hi Ramendra,
To my knowledge, you can only provide directives in a robots.txt file for the domain on which it lives. This goes for both the http/https and www/non-www versions of a domain. That's why it's important to handle all preferred domain formatting with redirects that point to your canonicalized version. So if you want http://www to be the indexed version, redirect all other versions to it.
There might be a workaround of some sort, but honestly, redirecting toward your preferred version as described above (and sketched below) is the direction you should take. Then you can manage a single robots.txt file, and your indexing will align better with what you want.
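For illustration, here's a minimal sketch of that kind of redirect in an Apache .htaccess file. This assumes mod_rewrite is enabled and uses example.com as a placeholder for your domain:

```apache
RewriteEngine On

# Send the bare (non-www) host to the preferred www version with a permanent (301) redirect
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

A single pattern rule like this covers every URL on the host, so there's no need for one-to-one redirects.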
-
Thanks Logan,
I have read somewhere that by using the Host directive in the robots.txt file, we can suggest to search engine bots which version of the website is the original when there are a number of mirror sites. So I was wondering whether we can prevent indexing/crawling of the HTTPS URLs by using the Host directive in the robots.txt of the HTTP site.
We are using an ecommerce SaaS platform for our website, which gives us only one robots.txt file, and it serves the HTTP site.
Is there any other way to prevent indexing/crawling of HTTPS URLs?
Regards,
Ramendra -
Hi Ramendra,
Based on what you said, it sounds like both versions of your site exist and are indexed, and you want to mitigate your duplicate content risk. If that's accurate, here are my recommendations on this:
- Robots.txt on an HTTP site cannot be used to prevent indexing/crawling of HTTPS URLs, and the Host directive won't help here either; Google ignores it (it was only ever honored by Yandex)
- Google crawls HTTPS by default, so if your site is fully secure, you need to redirect HTTP URLs over to their HTTPS twins; this can be done with a single redirect rule in .htaccess, so you don't need one-to-one redirects (see the sketch after this list)
- In addition to your HTTP-to-HTTPS redirects, you should also use canonical tags to signal your preferred version to search engines
- Your HTTPS site should have its own robots.txt file
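To make the redirect and canonical-tag points concrete, here's a minimal sketch, assuming an Apache server with mod_rewrite (your SaaS platform may expose this differently) and using example.com as a placeholder:

```apache
RewriteEngine On

# Redirect every HTTP request to its HTTPS twin with one permanent (301) rule
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [R=301,L]
```

The canonical tag then goes in the head of each page, pointing at the preferred HTTPS URL, e.g.:

```html
<link rel="canonical" href="https://www.example.com/sample-page/" />
```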
Related Questions
-
Can you use Screaming Frog to find all instances of relative or absolute linking?
My client wants to pull every instance of an absolute URL on their site so that they can update them for an upcoming migration to HTTPS (the majority of the site uses relative linking). Is there a way to use the extraction tool in Screaming Frog to crawl one page at a time and extract every occurrence of href="http://"? I have gone back and forth between using an XPath extractor as well as a regex and have had no luck with either. Ex. XPath: //*[starts-with(@href, "http://")][1] Ex. Regex: href=\"//
Technical SEO | | Merkle-Impaqt0 -
Can I use high ranking sites to push my competitors out of the first page of search results?
I'm looking at a bunch of long-tail, low-traffic keywords that aren't difficult to rank for. As I was idly doing a boring task, my mind wandered and I thought... why don't I ask lots of questions about these keywords on sites such as Moz, Quora, Reddit, etc., where the high DA will get them to rank for the search term? The results on an SEO site or Q&A site won't be relevant, and so I'd starve my competitors of some of their leads. Of course, I'm not sure the effort would be worth it, but would it work? (And no, none of my long-tail keywords are included in this post.)
Technical SEO | | Zippy-Bungle3 -
The use of robots.txt
Could someone please confirm that if I do not want to block any pages from my URL, then I do not need a robots.txt file on my site? Thanks
Technical SEO | | ICON_Malta0 -
Can I rely on just robots.txt
We have a test version of a client's website on a separate server before it goes onto the live server. Some code from the test site has somehow managed to get Google to index the test site, which isn't great! Would simply adding a robots.txt file to the root of the test site blocking all crawlers be good enough, or will I also have to put noindex and nofollow meta tags on all pages of the test site?
Technical SEO | | spiralsites0 -
Warnings for blocked by meta-robots/meta robots nofollow... how to resolve?
Hello, I see hundreds of notices for blocked by meta-robots/meta robots nofollow, and it appears they are linked to the comments on my site, which I assume I would not want to be crawled. Is this the case, and are these notices actually a positive thing? Please advise how to clear them up if these notices can be potentially harmful for my SEO. Thanks, Talia
Technical SEO | | M80Marketing0 -
Can I redirect a URL that has a # in it? How?
Hi there - My web developer is saying that I can't do a URL redirect with a "#" in it. Currently, the URL is actually an anchored link within a page (which the URL indicates with a #). I want to change the content to a new URL, but our website links internally to the old URL, so we would need to do a URL redirect (assume 301). Can you tell me if this is possible and how? Thanks!
Technical SEO | | sfecommerce0 -
How to make multiple url redirection using global.asax in IIS 6?
sir, I am working with IIS 6 site and i ant to redirect three different urls of a domain to one url, i.e, there are the different versions of the same url...so how can i create one? I have found a script on google. but it says redirecting one url. see it here: Sub Application_BeginRequest(ByVal sender as Object, ByVal e as EventArgs) Try Dim requestedDomain As String = HttpContext.Current.Request.Url.ToString().toLower() If InStr(requestedDomain, "http://yoursite.com") Then requestedDomain = requestedDomain.Replace("http://yoursite.com", "http://www.yoursite.com") Response.Clear() Response.Status = "301 Moved Permanently" Response.AddHeader("Location", requestedDomain) Response.End() End If Catch ex As Exception Response.Write("Error in Global.asax :" & ex.Message) End Try End Sub
Technical SEO | | VipinLouka780