Why is the old site not being deindexed post-migration?
-
We migrated to a new domain 16 days ago, and the new domain is being indexed at a normal rate (2-3k pages per day). The issue is that the old domain has not seen any drop in indexed pages. I was expecting the old domain's indexed-page count to fall roughly in step with the rise in indexed pages on the new site. Any advice?
-
Jarred,
Whenever you move to a new domain name, Google will keep the old domain indexed for up to a year (or longer!). That's just how Google handles migrations; I suspect it's because you might change your mind and move back to the old domain.
Having the old domain indexed in Google isn't a problem in itself, as users who land on it should be redirected to the matching content on the new domain. Just expect it to take up to a year for Google to stop indexing the old domain entirely.
By the way, make sure you use the Change of Address tool in Google Search Console; it tells Google explicitly that the site has moved and really helps speed things along.
-
We did some 301 redirects in early February. There are still some pages from the old domain hanging around in the SERPs; however, the 301s are sending that traffic to the right place.
The more powerful your domain, the longer its pages can take to drop from the SERPs, because crawlers keep arriving through the many existing links pointing at them. Weak domains can also take a long time to drop, for the opposite reason: they have lots of pages that are rarely crawled, so Google is slow to revisit them and discover the redirects.
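A quick way to sanity-check a batch of old-domain URLs during a migration is to record each one's status code and `Location` header and classify the result. Here is a minimal sketch of that classification step (the domain names and the `classify_redirect` helper are hypothetical; in practice you would gather the status/header pairs with a crawler or an HTTP library):

```python
def classify_redirect(status_code, location, new_domain):
    """Classify the response for one old-domain URL in a migration audit."""
    redirecting = status_code in (301, 302, 307, 308)
    if redirecting and location and new_domain in location:
        # 301/308 pass link equity permanently; 302/307 should be upgraded.
        return "ok" if status_code in (301, 308) else "temporary"
    if redirecting:
        return "wrong-target"   # redirecting, but not to the new domain
    if status_code == 404:
        return "dead"           # the link profile for this URL is being lost
    return "no-redirect"        # page still resolves on the old domain

# Sample crawl results (made-up data for illustration).
samples = [
    ("/about", 301, "https://newdomain.example/about"),
    ("/blog/post-1", 302, "https://newdomain.example/blog/post-1"),
    ("/old-page", 404, None),
]
for path, status, loc in samples:
    print(path, "->", classify_redirect(status, loc, "newdomain.example"))
```

Anything flagged `temporary`, `wrong-target`, `dead`, or `no-redirect` is a candidate for a proper 301 to the equivalent new-domain URL.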
-
Hi Jarred,
Regarding advice on this topic: what exactly are you trying to accomplish?
Is the issue that you are using the same content for both sites and are worried about duplicate content?
If this is the case, a 301 redirect should solve your problems.
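For reference, a domain-wide 301 at the server level might look something like this. This is only a sketch assuming Apache with mod_rewrite enabled; the domain names are placeholders, and the equivalent on nginx or at your host's control panel would differ:

```apache
# Redirect every request on the old domain to the same path on the new one.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?olddomain\.example$ [NC]
RewriteRule ^(.*)$ https://newdomain.example/$1 [R=301,L]
```

Redirecting path-to-path like this (rather than everything to the new homepage) is what preserves the link equity of the individual old pages.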
Have you stopped hosting on the old site?
If not, it still exists as far as Google is concerned, and you aren't going to see de-indexation. Even if you have stopped hosting, it can take months for Google to realize the site isn't there. Normally you start by seeing a few pages return 404 (or other 4xx) errors before they are removed completely. This isn't ideal, as you lose the link profile for those pages; hence the value of 301s.
Is it a 301 redirect situation?
If you are redirecting to the new domain, you are not going to de-index the old one. As far as Google is concerned, it still exists and will continue to exist for as long as it remains hosted.
In addition to the above, de-indexation of a website can take months. We ran into this with a client for whom we were transferring 300 domains: it took about 2-3 months for Google to recognize the pages on the new websites and disregard the old ones. That said, we were using redirects, and the old pages never truly disappeared from the index.
In short, 16 days probably isn't a long enough time frame to see significant changes, and if you are using 301s, full de-indexation may never happen at all. Nothing you've described here sounds like a problem.
If you want to fill me in on more details I'm happy to help as best I can.
Cheers,
Rob