My beta site (beta.website.com) has been inadvertently indexed. Its cached pages are taking traffic away from our real website (website.com). Should I just "NO INDEX" the entire beta site and if so, what's the best way to do this? Are there any other precautions I should be taking? Please advise.
-
For your beta sites in future, I would recommend using Basic HTTP Authentication so that spiders can't even access them (the directives below are for Apache):
AuthUserFile /var/www/sites/passwdfile
AuthName "Beta Realm"
AuthType Basic
require valid-user
Then create the user with: htpasswd -m /var/www/sites/passwdfile username
If you do this as well, Google's Removal Tool will conclude "OK, it's not there, I should remove the page," because Google usually checks the page content as part of processing a removal. If the content is still accessible, they MAY not process the removal request, even if the page has a noindex tag (though I'm not certain about that).
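A quick way to verify the lock-out (using the beta hostname from your question as a placeholder) is to request a page without credentials:
curl -I http://beta.website.com/
If the auth is working, the response status should be 401 Unauthorized rather than the page content.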
-
In Webmaster Tools, set the subdomain up as its own site and verify it
-
Add a robots.txt to the subdomain (beta.website.com/robots.txt) containing:
User-agent: *
Disallow: /
You can then submit this site for removal in Google Webmaster Tools
- Click "optimization" and then "remove URLs"
- Click "create a new removal request"
- Type the URL "http://beta.website.com/" in there
- Click "continue"
- Click "submit request".
-
Agreed on all counts with Mark. In addition, if you haven't done this already, make sure you have canonical tags in place on your pages. Good luck!
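For example (the URL here is a placeholder, and this assumes your beta and live pages map one-to-one), each page's head section would carry a tag pointing at the live URL:
<link rel="canonical" href="http://website.com/your-page/" />
That way, even if a beta copy does get crawled, the engines have a strong hint about which version is the real one.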
-
You can add noindex to the whole subdomain, and then wait for the crawlers to remove it.
Or you can register the subdomain with Webmaster Tools, block the subdomain via robots.txt with a blanket Disallow: / for the entire subdomain, and then use the URL removal tool in Webmaster Tools to remove the subdomain. A robots.txt block on its own won't work - it won't remove the pages, it'll just prevent them from being crawled again.
In your case, I would probably go the route of the robots.txt block plus the URL removal tool. This will remove the pages from Google. Once that has happened, I would put the noindex tag on the whole subdomain and remove the robots.txt block - that way, all search engines should stop indexing the pages and drop them from their indexes.
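A minimal sketch of that subdomain-wide noindex at the server level, assuming Apache with mod_headers enabled (the vhost details are illustrative):
<VirtualHost *:80>
    ServerName beta.website.com
    # Ask all engines to drop these pages and not index anything served here
    Header set X-Robots-Tag "noindex, nofollow"
</VirtualHost>
Note that crawlers can only see this header (or a meta noindex) once the robots.txt Disallow is lifted, which is why the block has to come off at that stage.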
Mark