Robots.txt for subdomain
-
Hi there Mozzers!
I have a subdomain with duplicate content and I'd like to remove these pages from the mighty Google index. The problem is: the website is built in Drupal and this subdomain doesn't have its own robots.txt.
So I want to ask you how to disallow and noindex this subdomain. Is it possible to add this to the root robots.txt:
User-agent: *
Disallow: /subdomain.root.nl/

User-agent: Googlebot
Noindex: /subdomain.root.nl/

Thank you in advance!
Partouter
-
A robots.txt file only applies to the subdomain it is placed on.
You need to create a separate robots.txt for each subdomain; Drupal allows this.
It must be located in the root directory of your subdomain (e.g. /public_html/subdomain/) so it can be accessed at http://subdomain.root.nl/robots.txt.
Add the following lines in the robots.txt file:
User-agent: *
Disallow: /
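If you want to sanity-check those two lines before deploying them, Python's built-in robotparser can evaluate the rules locally (a sketch, using subdomain.root.nl as a stand-in for your subdomain):

```python
from urllib import robotparser

# The proposed subdomain robots.txt: block everything for all crawlers
rules = """User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Any path on the subdomain is now disallowed for any user agent
print(rp.can_fetch("*", "http://subdomain.root.nl/"))           # False
print(rp.can_fetch("Googlebot", "http://subdomain.root.nl/x"))  # False
```

Keep in mind a Disallow only stops crawling; pages already in the index can take a while to drop out, so you may also want to request removal in Google Webmaster Tools.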
As an alternative, you can use a robots <META> tag on each page, or redirect the subdomain to a directory like root.nl/subdomain and disallow that directory in the main robots.txt. Personally I don't recommend either.
-
Not sure how your server is configured, but mine is set up so that subdomain.mydomain.com maps to a subdirectory like this:
http://www.mydomain.com/subdomain/
In robots.txt you would simply need to put:
User-agent: *
Disallow: /subdomain/

Others may have a better way though.
HTH
Steve
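A quick way to confirm that this rule fences off only the /subdomain/ directory and leaves the rest of the site crawlable is the same stdlib parser (a sketch, using www.mydomain.com, the placeholder domain from the answer above):

```python
from urllib import robotparser

# The suggested rule: block only the /subdomain/ directory
rules = """User-agent: *
Disallow: /subdomain/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Paths under /subdomain/ are blocked; everything else stays crawlable
print(rp.can_fetch("*", "http://www.mydomain.com/subdomain/page.html"))  # False
print(rp.can_fetch("*", "http://www.mydomain.com/about.html"))           # True
```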