Why is the old site not being deindexed post-migration?
-
We recently migrated to a new domain (16 days ago), and the new domain is being indexed at a normal rate (2-3k pages per day). The issue is that the old domain has not seen any drop in indexed pages. I was expecting the old domain's indexed-page count to fall roughly in step with the rise on the new site. Any advice?
-
Jarred,
Whenever you move to a new domain, Google will keep the old domain indexed for up to a year (or longer!). That's just how Google handles migrations; I suspect it's because you might change your mind and go back to the old domain.
Having the old domain indexed in Google isn't a problem, as users should be redirected to the equivalent content on the new domain.
By the way, make sure you use the Change of Address tool in Google Search Console; it really helps Google process the move.
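To make "redirected to the content on the new domain" concrete, here is a minimal Python sketch of a path-preserving redirect target; the hostnames are placeholders for illustration, not from this thread:

```python
from urllib.parse import urlsplit, urlunsplit

# Placeholder hostname for illustration only.
NEW_HOST = "www.new-example.com"

def redirect_target(old_url: str) -> str:
    """Return the new-domain URL an old-domain URL should 301 to,
    preserving the path and query string."""
    parts = urlsplit(old_url)
    return urlunsplit(("https", NEW_HOST, parts.path, parts.query, parts.fragment))
```

Redirecting every old URL to its exact equivalent like this (rather than to the new homepage) is what lets Google match old pages to new ones during the move.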
-
We did some 301 redirects in early February. There are still some pages on the old domain hanging in the SERPs - however, the 301s are sending the traffic to the right place.
Strong domains can take a long time to drop from the SERPs because spiders keep arriving through existing links and re-crawling the redirects. Weak domains can also take a long time, for the opposite reason: they have lots of pages that are rarely crawled, so Google is slow to notice the change.
-
Hi Jarred,
Regarding advice on this topic: what exactly are you trying to accomplish?
Is the issue that you are using the same content for both sites and are worried about duplicate content?
If this is the case, a 301 redirect should solve the problem.
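For reference, a domain-wide 301 of this kind is often a single rule at the web server. Here is a minimal sketch assuming nginx (hostnames are placeholders; Apache's `Redirect 301` achieves the same thing):

```nginx
server {
    listen 80;
    server_name old-example.com www.old-example.com;
    # 301 every request to the same path on the new domain
    return 301 https://www.new-example.com$request_uri;
}
```

The `$request_uri` part is what keeps the redirect path-preserving, so deep pages land on their new equivalents rather than the homepage.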
Have you stopped hosting on the old site?
If not, the old site still exists as far as Google is concerned, and you aren't going to see de-indexation. Even if you have stopped hosting, it can take months for Google to realize the site is gone. Normally you start by seeing a few pages return 404 errors before they are removed completely. That isn't ideal, because you lose the link profile of those pages; hence the value of 301s.
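The distinction above (404s losing the link profile, 301s preserving it) can be summarized in a small, hypothetical triage helper; the status-code groupings reflect standard HTTP semantics, not anything specific from this thread:

```python
# Hypothetical triage helper: given the HTTP status code an old-domain
# URL now returns, summarize what it means for the migration.
def redirect_health(status: int) -> str:
    if status in (301, 308):   # permanent redirect
        return "ok: link equity passes to the new URL"
    if status in (302, 307):   # temporary redirect
        return "warn: temporary redirect, change it to a 301"
    if status == 404:          # page gone, no forwarding
        return "lost: no redirect, the page's link profile is dropped"
    return "check: unexpected status, investigate manually"
```

Running something like this over a crawl of old-domain URLs is a quick way to spot pages that were missed when the redirects were set up.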
Is it a 301 redirect situation?
If you are redirecting to the new domain, you are not going to de-index the old one. As far as Google is concerned, it still exists and will continue to exist as long as the hosting (and the redirects) stays up.
On top of that, de-indexation of a website can take months. We saw this with a client for whom we were transferring 300 domains: it took about 2-3 months for Google to recognize the pages on the new sites and disregard the old ones. That said, we were using redirects, and the old pages never truly disappeared from the index.
In short, 16 days probably isn't a long enough time frame to see significant changes, and if you are using 301s, full de-indexation may never happen at all. Nothing you've described here sounds negative.
If you want to fill me in on more details I'm happy to help as best I can.
Cheers,
Rob