Blocking subdomains without blocking sites...
-
So let's say I am working for bloggingplatform.com, and people can create free sites through my tools, and those sites show up as myblog.bloggingplatform.com. However, that same site can also be accessed at myblog.com.
Is there a way, without editing the myblog.com site code or files, for me to tell Google to stop indexing myblog.bloggingplatform.com while still letting it index myblog.com, and without inserting any code into the page load?
This is a simplification of a problem I am running across.
Basically, Google is associating subdomains with my domain that it shouldn't even index, and it is adversely affecting my main domain. Other than contacting the offending subdomain holders (which we do), I am looking for a way to stop Google from indexing those subdomains at all (they are used for technical purposes, not for users to find the sites).
Thoughts?
-
Ah, I see now. Try this out: http://moz.com/community/q/block-an-entire-subdomain-with-robots-txt#reply_26992 - basically, when a subdomain is identified, the server pulls a different file into the robots.txt location (one containing the Disallow: / directive).
Read the remaining comments there about getting the subdomain removed via Google Webmaster Tools (GWT).
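The mechanism described in that thread could look something like this in nginx (a sketch under assumed hostnames and file paths, not from the linked answer; adapt it to your own server setup):

```nginx
# All *.bloggingplatform.com hosts are technical subdomains that
# should not be indexed, so they get a block-all robots.txt even
# though every hostname serves the same underlying site files.
server {
    listen 80;
    server_name *.bloggingplatform.com;

    location = /robots.txt {
        # A static file containing:
        #   User-agent: *
        #   Disallow: /
        alias /var/www/robots/disallow-all.txt;
    }

    # ... shared site configuration continues here ...
}
```

Requests for robots.txt on any other hostname (e.g. myblog.com) fall through to the normal server block and get the regular, index-allowing file.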
-
You are correct, but that isn't what I was asking.
user1.bloggingplatform.com and myblog.com point to the same web server files. If I put up a robots.txt on user1.b... I would effectively de-index myblog.com.
The problem we have run across is that user205.bloggingplatform.com might be doing something shady, but instead of de-listing just the subdomain, Google kills the primary domain from the index as well.
Because user205.bloggingplatform.com should only be used for technical reasons and should not be in Google's index, I am looking for a way to tell Google not to index the subdomain.
I think the better way to solve the problem, though, would be to change the technical subdomain's domain: from user205.bloggingplatform.com to user205.bloggingplatformtesting.com.
Then Google can kill that URL all it wants; I don't care.
-
bloggingplatform.com/robots.txt
and
user1.bloggingplatform.com/robots.txt
can and should be different. If you disallow at the subdomain level, only the subdomain will be affected. You can search around for other examples of this, but I'm certain it works (we have a development domain that is indexed, and we create subdomains for all clients that aren't indexed, each handled via its own robots.txt file).
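To make that concrete, here is a minimal sketch of picking a robots.txt body per hostname in application code (the hostnames and the block/allow split are assumptions for illustration, not from this thread):

```python
# Choose a robots.txt body based on the requesting Host header,
# so each subdomain can be blocked or allowed independently even
# though all hostnames are served from the same document root.

BLOCK_ALL = "User-agent: *\nDisallow: /\n"
ALLOW_ALL = "User-agent: *\nDisallow:\n"

# Hostnames that should never appear in the index (assumed examples).
BLOCKED_HOSTS = {"user1.bloggingplatform.com", "user205.bloggingplatform.com"}

def robots_for_host(host: str) -> str:
    """Return the robots.txt body for the requesting hostname."""
    host = host.lower().split(":")[0]  # normalize case, strip any port
    return BLOCK_ALL if host in BLOCKED_HOSTS else ALLOW_ALL
```

Wired into a route for /robots.txt, this serves the block-all file to the technical subdomains while myblog.com keeps its normal, indexable robots.txt.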
-
I don't think that works. Since both URLs point to the same server, the robots.txt file for the test URL would completely kill the main URL.
Or am I missing something?
-
Each subdomain should have a robots.txt file that blocks that specific subdomain. e.g. user1.bloggingplatform.com/robots.txt should have:
User-agent: *
Disallow: /
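As a quick sanity check, Python's standard-library robots.txt parser confirms that this file blocks every path for every crawler (the URLs below are just examples):

```python
from urllib import robotparser

# Parse the exact robots.txt body recommended above.
rp = robotparser.RobotFileParser()
rp.parse(["User-agent: *", "Disallow: /"])

# Every path on the subdomain is off-limits to every user agent.
print(rp.can_fetch("Googlebot", "http://user1.bloggingplatform.com/"))       # False
print(rp.can_fetch("*", "http://user1.bloggingplatform.com/any/page.html"))  # False
```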