Why is the old site not being deindexed post-migration?
-
We migrated to a new domain 16 days ago, and the new domain is being indexed at a normal rate (2-3k pages per day). The issue is that the old domain hasn't seen any drop in indexed pages. I was expecting the old domain's indexed-page count to fall roughly in step with the rise on the new site. Any advice?
-
Jarred,
Whenever you move to a new domain name, Google will keep the old domain indexed for up to a year (or longer!). That's just how Google handles migrations; I suspect it's because you might change your mind and move back to the old domain.
Having the old domain indexed in Google isn't a problem in itself, as users should be redirected to the corresponding content on the new domain. It can take up to a year for Google to stop indexing the old domain entirely.
By the way, make sure you submit the move through the Change of Address tool in Google Search Console; it really helps Google process the migration.
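For reference, the site-wide permanent redirect the answers keep mentioning is usually a few lines at the web-server level. A minimal Apache sketch, assuming `old.example` is moving to `new.example` (both placeholder domains, not from the thread):

```apache
# .htaccess on the old domain: send every URL to the same path
# on the new domain with a 301 (permanent) status.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old\.example$ [NC]
RewriteRule ^(.*)$ https://new.example/$1 [R=301,L]
```

On nginx, a `return 301 https://new.example$request_uri;` inside the old domain's server block does the same job. Path-for-path redirects like this (rather than everything to the homepage) are what preserve each page's link profile.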
-
We did some 301 redirects in early February, and there are still some pages from the old domain hanging around in the SERPs - however, the 301s are sending that traffic to the right place.
The more powerful your domain, the longer its pages can take to drop from the SERPs, because spiders keep arriving through existing links and re-encountering the old URLs. Counterintuitively, weak domains can also take a long time to drop - they have lots of pages that are rarely crawled, so Google is slow to notice anything changed.
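While you wait, one practical sanity check is to spot-check a handful of old-domain URLs and confirm each returns a clean 301 (not a 302) whose Location header points at the new domain. A rough Python sketch - `new.example` and the helper names here are illustrative, not from the thread:

```python
import urllib.request
import urllib.error


class NoFollow(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so we can inspect them."""

    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # causes urllib to raise HTTPError instead of following


def classify_redirect(status: int, location: str, new_host: str) -> str:
    """Label a response: only a 301 to the new host passes full link equity."""
    if status == 301 and new_host in location:
        return "ok"
    if status in (301, 302, 307, 308):
        return "redirects, but check target/type"
    return "no redirect"


def fetch_status(url: str) -> tuple[int, str]:
    """Return (status code, Location header) without following redirects."""
    req = urllib.request.Request(url, method="HEAD")
    opener = urllib.request.build_opener(NoFollow())
    try:
        resp = opener.open(req, timeout=10)
        return resp.status, ""
    except urllib.error.HTTPError as e:
        return e.code, e.headers.get("Location", "")
```

Usage would be `classify_redirect(*fetch_status("https://old.example/page"), "new.example")` for each URL on a sample list; a 302 or a redirect to the wrong place is worth fixing, since temporary redirects give Google less reason to swap the old URLs out of the index.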
-
Hi Jarred,
Regarding advice on this topic - what exactly are you trying to accomplish?
Is the issue that you are using the same content for both sites and are worried about duplicate content?
If this is the case, a 301 redirect should solve your problems.
Have you stopped hosting on the old site?
If not, it still exists as far as Google is concerned, and you aren't going to see de-indexation. Even if you have stopped hosting, it can take months for Google to realize the site is gone. Normally you'll first see a few pages returning 404 errors before they're removed completely. This isn't ideal, as you lose the link profile for those pages - hence the value of 301s.
Is it a 301 redirect situation?
If you are redirecting to the new domain, you are not going to de-index the old one. As far as Google is concerned, it still exists and will continue to exist for as long as it remains hosted.
In addition to the above, de-indexation of a website can take months. We hit this with a client for whom we were transferring 300 domains: it took about 2-3 months for Google to recognize the pages on the new websites and disregard the old ones. That said, we were using redirects, and the old pages never truly disappeared or de-indexed.
In short, 16 days probably isn't a long enough time frame to see any significant change - and if you are using 301s, full de-indexation may never happen at all. Nothing you've described here suggests anything is wrong.
If you want to fill me in on more details I'm happy to help as best I can.
Cheers,
Rob