Subdomain replaced domain in Google SERP
-
Good morning,
This is my first post. I found many Q&As here that mostly answer my question, but just to be sure we do this right I'm hoping the community can take a peek at my thinking below:
Problem: We rank #1 for "custom poker chips", for example. We have a development website on a subdomain (http://dev.chiplab.com). On Saturday our live 'chiplab.com' main domain was replaced by 'dev.chiplab.com' in the SERP.
Suspected Cause: We did not add a NOFOLLOW tag to the page headers. We also did not DISALLOW the subdomain in robots.txt. We could also have put the 'dev.chiplab.com' subdomain behind a password wall.
Solution: Add a NOFOLLOW header tag, update robots.txt on the subdomain, and disallow crawling/indexing.
Question: If we remove the subdomain from Google using WMT, will this drop us completely from the SERP? In other words, we would ideally like our root chiplab.com domain to replace the subdomain to get us back to where we were before Saturday. If the removal tool in WMT just removes the link completely, then is the only solution to wait until the site is recrawled and reindexed and hope the root chiplab.com domain ranks in place of the subdomain again?
Thank you for your time,
Chase
-
Hi Chase,
Removing dev via Webmaster Tools should do the trick for now. Then, since Google won't get to dev anymore, you should be safe.
Adding both noindex and password protection is not needed. Since it's password protected, Google won't get to see the noindex on the pages, so you should only do one of the two. No need to change anything now - the password protection is safe.
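For anyone setting this up later, a minimal sketch of the password wall on an Apache server (the file paths and names below are placeholders, and nginx or other servers would differ):

```apache
# .htaccess at the document root of dev.chiplab.com (assumes Apache; paths are placeholders)
AuthType Basic
AuthName "Dev site - team only"
# Keep the credentials file outside the web root
AuthUserFile /etc/apache2/.htpasswd-dev
Require valid-user
```

The credentials file itself can be created with `htpasswd -c /etc/apache2/.htpasswd-dev someuser`. Once it's in place, Googlebot gets a 401 and never sees the pages (or any noindex tag on them), which is exactly the point made above.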
Quoting Chase's follow-up: "As expected, 'dev.chiplab.com' was removed from the SERP. Now, I'm a bit worried that the link equity was transferred for good to the subdomain from 'www.chiplab.com'. That's not possible, right?"
Yes, that's not possible, so you are good.
Only 301 redirects tell Google to pass equity, so all good.
-
No worries, that's what this community is here for!
Google views subdomains as different entities. They have different authority metrics and therefore different ranking power. Removing a URL on a subdomain won't have any effect on its sibling over on a different subdomain (for example, dev. and www.).
Good call to keep the disallow: / on the dev.chiplab.com/robots.txt file - I forgot to mention that you should leave it there, for anti-crawling purposes.
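For reference, that disallow is just a two-line robots.txt served at the root of the dev subdomain - the sketch below, not the actual file:

```
# http://dev.chiplab.com/robots.txt
User-agent: *
Disallow: /
```

As noted further down the thread, this blocks crawling but doesn't by itself keep already-known URLs out of the index.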
This is the query you'll want to keep an eye on. The info: operator can be used to show you what Google has indexed as your 'canonical' homepage.
-
Hi Logan,
Last follow-up. I swear.

Since I'm pretty new to this, I got scared and cancelled the 'dev.chiplab.com' link removal request. I did this because I didn't want to go up to 14 days without any traffic (that's the estimated time I found that the Google SERP can take to update, even though we "fetched as Googlebot" in GWT). I may be wrong on the SERP update time?
So what I did was add a 301 permanent redirect from 'dev.chiplab.com' to 'www.chiplab.com'. I've kept the NOFOLLOW/NOINDEX header on all 'dev' subdomain pages, of course, and I've kept the DISALLOW in robots.txt for the dev.chiplab.com site specifically. So for now I'll hold off on doing work in the 'dev' site (because I can't test anything with the redirects happening), and then hopefully in 14 days or so the domain name will change gracefully in the Google SERP from dev.chiplab.com to www.chiplab.com. I did all of this because of how many sales we would lose if it took 14 days to start ranking again for this term. Good?
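For reference, a blanket dev-to-www 301 like the one Chase describes might look like this on an Apache server (a sketch only, not his actual configuration; mod_rewrite and the exact vhost/.htaccess setup are assumptions):

```apache
# In dev.chiplab.com's vhost config or .htaccess (assumes Apache with mod_rewrite enabled)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^dev\.chiplab\.com$ [NC]
# Send every dev URL to the same path on the live site with a permanent (301) redirect
RewriteRule ^/?(.*)$ http://www.chiplab.com/$1 [R=301,L]
```

Each dev URL gets sent to the same path on www, so deep links don't all land on the homepage.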
Best,
Chase
-
You should be all set! I wouldn't worry about link equity, but it certainly wouldn't hurt to keep an eye on your domain authority over the next few days.
-
Hi Logan,
Thanks for fast reply!
We did the following:
- Added NOINDEX on the entire subdomain
- Temporarily removed 'dev.chiplab.com' using Google Webmaster Tools
- Password protected 'dev.chiplab.com'
As expected, 'dev.chiplab.com' was removed from the SERP. Now, I'm a bit worried that the link equity was transferred for good to the subdomain from 'www.chiplab.com'. That's not possible, right? Do we now just wait until Googlebot crawls 'www.chiplab.com' and hope that it is restored to #1?
Thank you for your time (+Shawn, +Matt, +eyqpaq),
Chase
-
Noindex would be the easiest way.
I've seen some people with the same issue fix it by adding rel=canonical on the dev pages pointing to the live site, and the main site came back step by step with no interruptions...
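In practice that suggestion amounts to a tag like this in the head of each dev page, pointing at its live counterpart (the path below is illustrative, not a real chiplab.com URL):

```html
<!-- In the <head> of a dev page; the path here is illustrative only -->
<link rel="canonical" href="http://www.chiplab.com/custom-poker-chips">
```

Each dev URL would point at its own matching www URL rather than everything pointing at the homepage.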
Cheers.
-
Just like Chase said, noindex your dev site to let the search engines know that it should not show in search. I do this on my dev sites every time.
-
The ideal method would be to make the dev site password protected. What I would do is 301 redirect the dev pages to the corresponding pages on the live site, and then once the SERP refreshes, put the dev site behind a password.
-
Hi Chase,
Removing the subdomain within Search Console (WMT) will not remove the rest of your WWW URLs. Since you have different properties in Search Console for each, they are treated separately. That removal is only temporary though.
The most sure-fire way to ensure you don't get dev. URLs indexed is to put a NOINDEX tag on that entire subdomain. NOFOLLOW simply means that links on whatever page that tag is on won't be followed by bots.
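To make that distinction concrete: a per-page NOINDEX is a meta tag in the head, while covering an entire subdomain is usually easier with an X-Robots-Tag response header. Both snippets below are sketches, and the Apache one assumes mod_headers is available:

```html
<!-- Per page, in the <head>: keep this page out of the index and don't follow its links -->
<meta name="robots" content="noindex, nofollow">
```

```apache
# Subdomain-wide alternative, in dev.chiplab.com's vhost or .htaccess (assumes Apache with mod_headers):
# every response from the dev site, HTML or not, carries the directive
Header set X-Robots-Tag "noindex, nofollow"
```

Either way, Google has to be able to crawl the page to see the directive, which ties into the crawling-vs-indexing point below.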
Remember, crawling and indexing are different things. For example, if on your live www. site you had an absolute link somewhere in the mix that pointed at dev.chiplab.com, then since you presumably haven't nofollowed your live site, a bot will still discover that page. The same goes for a robots.txt disallow: that only prevents crawling, not indexing. A disallowed URL can still end up in the index (based on links pointing to it) even though Google can't crawl it. See this query for an example.