Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), we have locked both new posts and new replies.
Block an entire subdomain with robots.txt?
- Is it possible to block an entire subdomain with robots.txt? I write for a blog that has its root domain as well as a subdomain pointing to the exact same IP. Getting rid of the subdomain is not an option, so I'd like to explore other ways to avoid duplicate content. Any ideas?
- Awesome! That did the trick -- thanks for your help. The site is no longer listed.
- Fact is, the robots.txt file alone will never work (the link has a good explanation why; in short, all it does is stop the bots from crawling again, not remove what's already indexed). Best to request removal, then wait a few days.
- Yeah. As of yet, the site has not been de-indexed. We placed the conditional rule in .htaccess and are getting different robots.txt files for the domain and subdomain, so that works. But I've never done this before, so I don't know how long it's supposed to take. I'll try to verify via Webmaster Tools to speed up the process. Thanks
- You should submit a removal request in Google Webmaster Tools. You have to verify the sub-domain first, then request the removal. See this post on why the robots.txt file alone won't work: http://www.seomoz.org/blog/robot-access-indexation-restriction-techniques-avoiding-conflicts
- Awesome. We used your second idea and so far it looks like it is working exactly how we want. Thanks for the idea. Will report back to confirm that the subdomain has been de-indexed.
- Option 1 could come with a small performance hit if you have a lot of .txt files being served on the server. There shouldn't be any negative side effects to option 2 as long as the rewrite is clean (i.e. not accidentally a redirect) and the contents of the two files are robots-compliant. Good luck!
- Thanks for the suggestion. I'll definitely have to do a bit more research into this one to make sure that it doesn't have any negative side effects before implementation.
- We have a plugin right now that places canonical tags, but unfortunately, the canonical for the subdomain points to the subdomain. I'll look around to see if I can tweak the settings.
- Sounds like (from other discussions) you may be stuck requiring a dynamic robots.txt file which detects which domain the bot is on and changes the content accordingly. This means the server has to run all .txt files as (I presume) PHP. Or, you could conditionally rewrite the /robots.txt URL to a different file according to the sub-domain:

RewriteEngine on
RewriteCond %{HTTP_HOST} ^subdomain\.website\.com$
RewriteRule ^robots\.txt$ robots-subdomain.txt [L]

Then add:

User-agent: *
Disallow: /

to the robots-subdomain.txt file (untested).
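Since that rule set is untested, one way to sanity-check the blocking file offline is Python's standard-library robots.txt parser. A minimal sketch, assuming robots-subdomain.txt contains exactly the two lines above (subdomain.website.com is a placeholder name):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical contents of robots-subdomain.txt: block everything
robots_txt = """User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# With "Disallow: /" for all user agents, no crawler may fetch any URL
print(rp.can_fetch("Googlebot", "http://subdomain.website.com/"))       # False
print(rp.can_fetch("*", "http://subdomain.website.com/any/page.html"))  # False
```

This only confirms the file's syntax does what you intend; as noted elsewhere in the thread, it won't remove pages that are already indexed.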
- Placing canonical tags isn't an option? Detect that the page is being viewed through the subdomain, and if so, write the canonical tag on the page back to the root domain. Or, just place a canonical tag on every page pointing back to the root domain (so the subdomain and root domain pages would both have them). Apparently, it's OK to have a canonical tag on a page pointing to itself. I haven't tried this, but if Matt Cutts says it's OK...
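For illustration (using placeholder URLs), a canonical tag in the head of a subdomain page pointing back to the root-domain version of that page might look like:

```html
<link rel="canonical" href="http://www.website.com/some-post/" />
```

The same tag would appear on the root-domain copy of the page, making it the self-referencing case described above.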
- Hey Ryan, I wasn't directly involved with the decision to create the subdomain, but I'm told it was necessary in order to bypass certain elements that were affecting the root domain. Nevertheless, it is a blog, and users now need to log in to the subdomain in order to access the WordPress backend to bypass those elements. Traffic for the site still goes to the root domain.
- They both point to the same location on the server? So there's not a different folder for the subdomain? If that's the case, then I suggest adding a rule to your .htaccess file to 301 the subdomain back to the main domain, in exactly the same way people redirect from non-www to www or vice-versa. However, you should ask why the server is configured to have a duplicate subdomain; you might just edit your Apache settings to get rid of that subdomain (usually done through a cPanel interface). Here is what your .htaccess might look like:

<IfModule mod_rewrite.c>
RewriteEngine on
# Redirect non-www to www
RewriteCond %{HTTP_HOST} !^www\.mydomain\.org [NC]
RewriteRule ^(.*)$ http://www.mydomain.org/$1 [R=301,L]
</IfModule>
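Adapting that same pattern to the duplicate-subdomain case discussed in this thread, a hypothetical rule (subdomain.website.com and www.website.com are placeholder names) might look like:

```apache
<IfModule mod_rewrite.c>
RewriteEngine on
# 301 every request on the duplicate subdomain to the same path on the root domain
RewriteCond %{HTTP_HOST} ^subdomain\.website\.com$ [NC]
RewriteRule ^(.*)$ http://www.website.com/$1 [R=301,L]
</IfModule>
```

This is a sketch, not a tested configuration; note it would also redirect the WordPress login on the subdomain, so it may not suit the setup described later in the thread.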
- Not to me, LOL. I think you'll need someone with a bit more expertise in this area than I to assist in this case. Kyle, I'm sorry I couldn't offer more assistance, but I don't want to tell you something if I'm not 100% sure. I suspect one of the many bright SEOmozers will quickly come to the rescue on this one. Andy
- Hey Andy, Herein lies the problem. Since the domain and subdomain point to the exact same place, they both utilize the same robots.txt file. Does that make sense?
- Hi Kyle, Yes, you can block an entire subdomain via robots.txt, however you'll need to create a robots.txt file and place it in the root of the subdomain, then add the code to direct the bots to stay away from the entire subdomain's content:

User-agent: *
Disallow: /

Hope this helps!