Getting subdomains unindexed
-
If I turn an application off so that it displays a 503 error, will that get my site unindexed from search engines?
-
Subdomains can be verified as their own sites in GWT. Verify the subdomain in GWT, put a robots.txt on that subdomain excluding the entire subdomain, and then request removal of that entire subdomain in GWT. I've had to remove staging and dev sites a couple of times myself.
A couple of things I've found useful in this situation: make the robots.txt files for both the dev and live sites read-only, so you don't accidentally overwrite one with the other when pushing a site live. You can also sign up for a free tool like Pole Position's Code Monitor, which looks at the code of a page (including your robots.txt URL) once a day and emails you if there are any changes, so you can fix the file and then go hunt down whoever changed it.
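For reference, the blocking robots.txt only needs two directives. Placed at the root of the staging subdomain (dev.example.com is just a placeholder hostname here), it excludes the entire host:

User-agent: *
Disallow: /

The live site's robots.txt would not contain that Disallow: / line, which is exactly why keeping the two files from being swapped during a deploy matters.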
-
GWT was the first place I checked; unfortunately you can only remove directories or pages. I need entire subdomain sites to be removed (in fact, they shouldn't have been indexed in the first place).
We use subdomains as our development and testing environment when creating client sites, and once a site is approved we push it live, replacing the old site. Somehow these testing sites are getting indexed, which creates a duplicate-content risk across different domains. So I am trying to find a solution to get the subdomains (hundreds of them) unindexed.
I understand a 301 redirect is best, but that isn't really applicable since these test sites still need to be reachable by clients.
-
With a robots.txt blocking it, you can then go into Google Webmaster Tools and request removal of that particular page or folder from Google's index.
-
A noindex tag on it works, and putting up a robots.txt that disallows everyone should work as well.
-
Thanks for the quick reply, I will have to try that. Essentially I am trying to get the site unindexed, but I wasn't sure if a 503 would do the trick.
-
Eventually, but a 503 is the code Google recommends returning when your site is having downtime, so I would expect them to be more lenient and not remove things right away. I wouldn't expect it to be as efficient as returning a 404 or a 410.
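To make that concrete, "returning a 503" means the maintenance response carries the status line and, ideally, a Retry-After header telling crawlers when to come back (the 3600 seconds below is only an example value):

HTTP/1.1 503 Service Unavailable
Retry-After: 3600

Because Google treats a 503 as a temporary condition, it tends to keep retrying the URLs rather than dropping them, which is why it is slow as a de-indexing tool.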
The best way to get content de-indexed is to return a page with a meta noindex tag on it, if you're really keen on getting it removed immediately.
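Concretely, that means each page on the subdomain returns a normal 200 response with this tag in its head section:

<meta name="robots" content="noindex">

The equivalent HTTP response header, handy when editing templates is awkward, is:

X-Robots-Tag: noindex

One caveat: crawlers can only see a noindex tag or header on pages they are allowed to fetch, so if the whole subdomain is also blocked in robots.txt, Google may never re-crawl the pages to discover the noindex.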
Related Questions
-
I am getting an error message from Google Webmaster Tools and I don't know what to do to correct the problem
The message is:
"Dear site owner or webmaster of http://www.whitegyr.com/, We've detected that some of your site's pages may be using techniques that are outside Google's Webmaster Guidelines. If you have any questions about how to resolve this issue, please see our Webmaster Help Forum for support. Sincerely, Google Search Quality Team" I have always tried to follow Google's guidelines and don't know what I am doing wrong, I have eight different websites all getting this warning and I don't know what is wrong, is there anyone you know that will look at my sites and advise me what I need to do to correct the problem? Website with this warning:
artistalaska.com
cosmeticshandbook.com
homewindpower.ws
montanalandsale.com
outdoorpizzaoven.net
shoes-place.com
silverstatepost.com
www.whitegyr.com
-
Get rid of a large amount of 404 errors
Hi all, the problem: Google pointed out to me that I have a large increase in 404 errors. In short, I previously had software that created pages (automated) for long-tail search terms and fed them to Google. Recently I quit this service and all those pages (about 500,000) were deleted. Now Google Webmaster Tools points out about 800,000 404 errors. What I noticed: I had a large amount of 404s before when I changed my website. I fixed it (proper 302) and as soon as all the 404s in GWT were gone I had around 200 more visitors a day. It seems that a clean site is better positioned. Does anybody have any suggestions on how to tell Google that all URLs starting with www.domain/webdir/ should be deleted from its cache?
-
How do you get a Google+ pic in your SERP snippet
Hi from 20 degrees C and 83% humidity in Wetherby, UK 🙂 A few weeks back I decided I needed to get my pretty face appearing in my SERPs for www.davidclick.com. But after having set up a Google+ account and linking my site to the Google+ account, I think I may have done something wrong 😞 I linked to the Google+ page via a footer link on www.davidclick.com, but alas I'm not able to get my face in my SERP, which this website has: http://i216.photobucket.com/albums/cc53/zymurgy_bucket/google-plus-picJPGcopy.jpg So my question is please: "How do you get your Google+ account image to appear in the SERPs?" Ta muchly,
David
-
To get a homepage to rank highly for a given term, do you only add the term (and variations) to the homepage?
If you add the term to all pages, won't the pages start competing with each other in the SERPs? I'm building a site for a client who wants to rank highly for 'Southern California general contractors' and it's not clear to me whether to fit these keywords (and variations) only into the homepage or to try to fit them into all pages.
-
How to block Google robots from a subdomain
I have a subdomain that lets me preview the changes I put on my site. The live site URL is www.site.com; the working preview version is www.site.edit.com. The contents of both are almost identical. I want to block the preview version (www.site.edit.com) from Google's robots so that they don't penalize me for duplicated content. Is this the right way to do it: User-Agent: * Disallow: .edit.com/*
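A brief clarification on that last line: Disallow rules match URL paths on whichever host serves the robots.txt file, not other hostnames, so the usual approach is to serve a separate file at www.site.edit.com/robots.txt (the hostname from the question) containing:

User-agent: *
Disallow: /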
-
Subdomain Removal in Robots.txt with Conditional Logic??
I would like to see if there is a way to add conditional logic to the robots.txt file so that when we push from DEV to PRODUCTION and the robots.txt file is pushed, we don't have to remember to NOT push the robots.txt file OR edit it when it goes live. My specific situation is this: I have www.website.com, dev.website.com and new.website.com, and somehow Google has indexed DEV.website.com and NEW.website.com. I'd like these to be removed from Google's index as they are causing duplicate content. Should I: a) add 2 new GWT entries for DEV.website.com and NEW.website.com and VERIFY ownership? If I do this, then when the files are pushed to LIVE, won't the files contain the VERIFY META CODE for the DEV version even though it's now LIVE? (hope that makes sense) b) write a robots.txt file that specifies "DISALLOW: DEV.website.com/"? Is that possible? I have only seen examples of DISALLOW with a "/" at the beginning... Hope this makes sense, can really use the help! I'm on a Windows Server 2008 box running ColdFusion websites.
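On option (b): robots.txt has no conditional syntax and a Disallow line cannot reference another hostname, but the same effect can be achieved by serving a different file per host, so the same code can be pushed to every environment. As a sketch only (written as an Apache .htaccess rule for illustration; the Windows/IIS setup described here would need the equivalent IIS URL Rewrite rule, and the hostnames and file name are placeholders based on the question):

RewriteEngine On
# Serve a blocking file whenever robots.txt is requested on the dev or new hostnames
RewriteCond %{HTTP_HOST} ^(dev|new)\.website\.com$ [NC]
RewriteRule ^robots\.txt$ /robots-blocked.txt [L]

Here robots-blocked.txt would contain a blanket User-agent: * / Disallow: / block, while the production hostname keeps serving the normal robots.txt.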
-
How to get user-generated reviews indexed properly?
We are currently working to improve the deployment of a review widget on our website. The widget was deployed about 18 months ago and all reviews are behind Java navigation. I have been working with our IT staff to get the reviews into an HTML page which will either live on the product page as a tab or be linked from the product page. Our IT staff has suggested leaving the Java navigation for users and creating separate HTML pages specifically for search engines. Based on my experience, this sounds like a bad idea: basically creating pages just for search engines that will not be used by site visitors, although the visitors will have access to the same content via the Java navigation. Anyone care to comment on this? Is creating HTML pages specifically for search engines a bad idea? An acceptable idea?
-
How to use overlays without getting a Google penalty
One of my clients is an email subscriber-led business offering deals that are time sensitive and which expire after a limited, but varied, time period. Each deal is published on its own URL, and in order to drive subscriptions to the email, an overlay was implemented that would appear over the individual deal page so that the user was forced to subscribe if they wished to view the details of the deal. Needless to say, this led to the threat of a Google penalty, which appears (fingers crossed) to have been narrowly avoided as a result of a quick response on our part to remove the offending overlay. What I would like to ask is whether you have any safe and approved methods for capturing email subscribers without revealing the premium content to users before they subscribe? We are considering the following approaches: First Click Free for Web Search - this is an opt-in service by Google which is widely used for this sort of approach and which stipulates that you have to let the user see the first item they click on from the listings, but can put up the subscriber-only overlay afterwards. Noindex, nofollow - if we simply noindex, nofollow the individual deal pages where the overlay is situated, will this remove the "cloaking offense" and therefore the risk of a penalty? Partial view - if we show one or two paragraphs of text from the deal page with the rest covered up by the subscribe-now lockup, will this still be cloaking? I will write up my first SEOmoz post on this once we have decided on the way forward and monitored the effects, but in the meantime, I welcome any input from you guys.