Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not removing the content entirely (many posts will remain viewable), we have locked both new posts and new replies.
Block a sub-domain from being indexed
-
This is a pretty quick and simple (I'm hoping) question: what is the best way to completely block a subdomain from being indexed by all search engines?
One item I cannot use is the meta "nofollow" tag.
Thanks! - Kyle
-
Keep in mind that Google indexes everything that it can crawl. Even if you put a block in the robots.txt, the URLs can still end up in the index. You can require a password on that subdomain and keep the big G out. This is easy to do if you have a site with cPanel access: just go to manage permissions and password-protect that directory with an .htaccess password.
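If you want to set this up by hand rather than through cPanel, a minimal sketch of the directives for the subdomain's .htaccess might look like the following (the AuthUserFile path and the label are just placeholders, not anything your host requires):
# placed in the subdomain's document root .htaccess
AuthType Basic
AuthName "Restricted - authorised users only"
AuthUserFile /home/youraccount/.htpasswds/subdomain.passwd
Require valid-user
The password file itself is created with the htpasswd command-line utility; cPanel's password-protection screen generates it for you.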
-
The robots.txt file just tells the bots you would "prefer" they not index the content, but there is nothing to prevent them from indexing. The only sure way to do this is to restrict access to the sub-domain for everyone and require some sort of authentication. If they don't have access, they can't index.
-
In subdomain.example.com/robots.txt add the statements:
User-agent: *
Disallow: /
Warning: Be absolutely certain that the above statements are not included in your example.com/robots.txt file or you'll kill your site.
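If the subdomain happens to share a document root with the main site (so there is only one physical robots.txt), one way to keep the two apart is to serve a different file depending on the requested host name. This is just a sketch, assuming Apache mod_rewrite and a hypothetical robots_sub.txt sitting next to the real robots.txt:
# in the shared document root's .htaccess
RewriteEngine On
RewriteCond %{HTTP_HOST} ^subdomain\.example\.com$ [NC]
RewriteRule ^robots\.txt$ /robots_sub.txt [L]
robots_sub.txt then carries the User-agent: * / Disallow: / statements above, while example.com/robots.txt stays untouched.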
-
Each subdomain may have its own robots.txt file. So for that subdomain, you can put:
User-agent: *
Disallow: /
In the robots.txt, and that should do it.
Please note that disallowing pages in robots.txt will not necessarily mean they won't appear on search result pages. If people link to pages that are disallowed on that subdomain, they can still appear in SERPs. I had this happen with a few pages, which leads to funny listings in the SERPs because Google has to guess what the title and description of the page should be, since it's not allowed to read the page. The meta noindex tag is the way to go if you want to be really sure the page doesn't appear in the SERPs. If you use that, don't disallow the page. Here's a recent SEOmoz post about it: http://www.seomoz.org/blog/robot-access-indexation-restriction-techniques-avoiding-conflicts
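For reference, the noindex tag that post recommends goes in the <head> of every page on the subdomain, along the lines of:
<meta name="robots" content="noindex">
If editing every template isn't practical, the same signal can usually be sent as an HTTP response header instead, for example with Apache's mod_headers in the subdomain's configuration (again, just a sketch):
Header set X-Robots-Tag "noindex"
Either way, leave the pages crawlable so the bots can actually see the directive.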
-
That was going to be my assumption, but I wasn't 100% sure how they worked with subdomains. Are you able to supply a little more information on implementation? It is extremely important that it only blocks sub.domain.com and not domain.com.
-
Related Questions
-
Not all images indexed in Google
Hi all, we recently ran into an unusual issue with images in Google's index. We have more than 1,500 images in our sitemap, but according to Search Console only 273 of those are indexed. If I check Google image search directly, I find more images in the index, but still not all of them. For example, this post has 28 images and only 17 are indexed in Google image search. This is happening to other posts as well. I've checked all possible reasons (missing alt, image as background, file size, fetch and render in Search Console), but none of these are relevant in our case. So everything looks fine, but not all images are in the index. Any ideas on this issue? Your feedback is much appreciated, thanks.
Technical SEO | flo_seo
-
Are images stored in Amazon S3 buckets indexable to your domain?
We're storing all our images in an S3 bucket, common practice, but we want to get these images to drive traffic back to our site -- and credit for that traffic. We've configured the URLs to be s3.owler.com/<image_name>/<image_id>. I've not seen any of these images show up in our Webmaster Tools. I am wondering if we're actually not going to get the credit for these images because technically they do sit on another domain.
Technical SEO | mindofmiller
-
Clients domain expired - rankings lost - repurchased domain - what next?
It's only been 10 days and I have repurchased/renewed the domain name. The whois info, website and contact information are all still the same. However, we have lost all rankings and I am hoping that our top rankings come back. Does anyone have experience with such a crappy situation?
Technical SEO | waqid
-
Are .clinic domains effective?
We acquired a .clinic domain for a client; they are currently running under a .ca, and I was just wondering if there are any cons to making the switch. On the flip side, are there any pros? I've tried to search for the answer but couldn't seem to come across anything. Thank you if you have any knowledge or could point me to a resource.
Technical SEO | webignite
-
Correct linking to the /index of a site and subfolders: what's the best practice? Link to domain.com/ or domain.com/index.html?
Dear all, starting with my .htaccess file:
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www.inlinear.com$ [NC]
RewriteRule ^(.*)$ http://inlinear.com/$1 [R=301,L]
RewriteCond %{THE_REQUEST} ^.*/index.html
RewriteRule ^(.*)index.html$ http://inlinear.com/ [R=301,L]
1. I redirect all URL requests with www. to the non-www version.
2. All requests for "index.html" are redirected to "domain.com/".
My questions are:
A) When linking from a page to my front page (home), is the best practice to link to "http://domain.com/" and NOT to "http://domain.com/index.php"?
B) When linking to the index of a subfolder, "http://domain.com/products/index.php", I should also link to "http://domain.com/products/" and not include the index.php, right?
C) When I define the canonical URL, should I also define it as just "http://domain.com/products/", or should I in this case point to the actual file, "http://domain.com/products/index.php"?
Is A) and B) the best practice? And C)? Thanks for all replies! 🙂
Holger
Technical SEO | inlinear
-
Umlaut in domain
Hi, my client wants to expand its business to Germany and logically we need a domain name to match. We've found a great one and registered several variants of it. However, I just found out that in Germany it is possible (while here it's not) to register a domain with an umlaut. My question is: will Google assign more value to schädlinge.de than to schadlinge.de when users search for schädlinge? If yes, how large will the difference be? (I will use an umlaut in the title etc.) Kind regards,
Jason.
Technical SEO | media-surfer
-
How do I check if my IP is blocked?
We changed servers and where our sites once ranked very highly (page 1 for all sites), they now are nowhere to be seen. Someone suggested that our IP might be blocked. Someone else suggested SEOMoz was the place to go to get it checked. Any help would be GREATLY appreciated. With thanks. Bryan
Technical SEO | FortressLearning
-
What is the best method to block a sub-domain, e.g. staging.domain.com/ from getting indexed?
Now that Google considers subdomains as part of the TLD, I'm a little leery of testing robots.txt on staging.domain.com with something like:
User-agent: *
Disallow: /
in fear it might get the www.domain.com blocked as well. Has anyone had any success using robots.txt to block sub-domains? I know I could add a meta robots tag to the staging.domain.com pages, but that would require a lot more work.
Technical SEO | fthead9