Getting subdomains unindexed
-
If I take an application offline and display a 503 error, will that get my site unindexed from search engines?
-
Subdomains can be verified as their own sites in Google Webmaster Tools (GWT). Verify the subdomain in GWT, put a robots.txt on that subdomain excluding the entire subdomain, then request removal of that entire subdomain in GWT. I've had to remove staging and dev sites a couple of times myself.
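The disallow-everything robots.txt described above is only two lines. Served at the root of the staging subdomain (the hostname here is just an example), it tells all compliant crawlers to stay out:

```
# https://staging.example.com/robots.txt
User-agent: *
Disallow: /
```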
A couple of things I've found useful in this situation are to make the robots.txt files for both the dev and live sites read-only, so you don't accidentally overwrite one with the other when pushing a site live. You can also sign up for a free tool like Pole Position's Code Monitor, which will look at the code of a page (including your robots.txt URL) once a day and email you if there are any changes, so you can fix the file and then go hunt down whoever changed it.
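A minimal sketch of the read-only trick, using only Python's standard library (the filename is an example; in practice you'd point it at the real dev and live paths). A deploy script running as the same non-root user will then fail loudly with a PermissionError instead of silently overwriting the file:

```python
import os
import stat

# Example path; in practice this is the dev or live site's robots.txt.
path = "robots.txt"
with open(path, "w") as f:
    f.write("User-agent: *\nDisallow: /\n")

# 0o444: read-only for owner, group, and others.
os.chmod(path, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)

# A plain open-for-write now raises PermissionError (unless run as root).
mode = stat.S_IMODE(os.stat(path).st_mode)
print(oct(mode))  # prints 0o444
```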
-
GWT was the first place I checked; unfortunately, you can only remove directories or pages. I need entire subdomained sites to be removed (in fact, they shouldn't have been indexed in the first place).
We use subdomains as our development testing environment when creating client sites, and once a site is approved we push it live, replacing the old site. Somehow these testing sites are getting indexed, which creates a duplicate-content risk across different domains. So I am trying to find a way to get the subdomains (hundreds of them) unindexed.
I understand a 301 redirect is best, but that isn't really applicable since these test sites still need to be reachable by clients.
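One way to square that circle, sketched here assuming Apache with mod_headers on the staging servers, is to send an `X-Robots-Tag` response header site-wide on each dev subdomain: clients can still browse every page, while crawlers are told not to index anything (it also covers non-HTML files like PDFs, which a meta tag cannot):

```apache
# .htaccess at the root of the staging subdomain (requires mod_headers)
Header set X-Robots-Tag "noindex, nofollow"
```

Because this lives in the staging vhost's own config, it won't be carried over when the approved site is pushed live to the production domain.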
-
With a robots.txt blocking it, you can then go into Google Webmaster Tools and request removal of that particular page or folder from Google's index.
-
A noindex tag on it works, and putting up a robots.txt that disallows everyone should work as well.
-
Thanks for the quick reply; I will have to try that. Essentially I am trying to get the site unindexed, but I wasn't sure if a 503 would do the trick.
-
Eventually, but that's the code Google recommends returning when your site is having downtime, so I would expect them to be lenient and not remove things right away. I wouldn't expect it to be as effective as returning a 404 or a 410.
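A sketch of the 503-during-downtime pattern described above, using only Python's standard library (the port and the Retry-After value are placeholders):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class MaintenanceHandler(BaseHTTPRequestHandler):
    """Answer every request with 503 Service Unavailable."""

    def do_GET(self):
        # 503 signals a *temporary* outage; Retry-After hints when the
        # crawler should come back, so URLs are not dropped right away.
        self.send_response(503)
        self.send_header("Retry-After", "3600")  # seconds; placeholder
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(b"Down for maintenance\n")

    def log_message(self, fmt, *args):
        pass  # keep the demo quiet

# To run it: HTTPServer(("", 8080), MaintenanceHandler).serve_forever()
```

This is exactly why a 503 is the wrong tool for de-indexing: it tells the crawler to come back later, not that the content is gone.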
If you're really keen on getting it removed immediately, the best way to get content de-indexed is to return the page with a meta noindex tag on it.
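The meta tag in question is a one-liner in each page's head (the `X-Robots-Tag` response header is the equivalent for non-HTML resources):

```html
<!-- In the <head> of every page you want dropped from the index -->
<meta name="robots" content="noindex">
```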