Why is the old site not being deindexed post-migration?
-
We recently migrated to a new domain (16 days ago), and the new domain is being indexed at a normal rate (2-3k pages per day). The issue is that the old domain has not seen any drop in indexed pages. I was expecting the old domain's indexed-page count to fall roughly in step with the increase on the new site. Any advice?
-
Jarred,
Whenever you move to a new domain name, Google will keep the old domain indexed for up to a year (or longer!). That's just how Google handles migrations; I suspect it's because you might change your mind and go back to the old domain.
Having the old domain indexed in Google isn't a problem in itself, as users who land on it should be redirected to the equivalent content on the new domain. Just expect it to take up to a year for Google to stop indexing the old URLs.
By the way, make sure you use the Change of Address tool in Google Search Console; it will really help.
-
We did some 301 redirects in early February. There are still some pages from the old domain hanging around in the SERPs; however, the 301s are sending that traffic to the right place.
The more powerful your domain, the longer its pages can take to drop from the SERPs, because spiders keep arriving through existing links and re-encountering those URLs. Weak domains can take a long time too, but for the opposite reason: they have lots of pages that are crawled so rarely that Google is slow to notice they've moved.
-
Hi Jarred,
Regarding advice on this topic: what are you trying to accomplish?
Is the issue that you are using the same content for both sites and are worried about duplicate content?
If this is the case, a 301 redirect should solve your problems.
Have you stopped hosting on the old site?
If not, the old site still exists as far as Google is concerned, and you aren't going to see deindexation. Even if you have stopped hosting it, it can take months for Google to realize the site is gone. Normally you first see a few pages returning 404 errors before they are removed completely. That isn't ideal, because you lose the link profile of those pages; hence the value of 301s.
Is it a 301 redirect situation?
If you are redirecting to the new domain, you are not going to deindex the old one. As far as Google is concerned, it still exists and will continue to exist as long as it remains hosted.
In addition to the above, deindexation of a website can take months. We had this issue with a client we were transferring 300 domains for, and it took about two to three months for Google to recognize the pages on the new websites and start disregarding the old ones. That said, since we were using redirects, the old pages never truly disappeared or deindexed.
In short, 16 days probably isn't a long enough time frame to see any significant changes, and if you are using 301s, the old URLs may never fully drop out of the index. Nothing you've described sounds negative.
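If you want a quick way to see where things stand, the status-code behavior described above can be sketched as a small check. This is just an illustrative sketch, not a Moz tool; the function name and messages are my own, and you'd feed it the HTTP status and Location header that a crawl of the old domain's URLs returns:

```python
from typing import Optional

def classify_old_url(status: int, location: Optional[str], new_domain: str) -> str:
    """Classify what a crawler sees for one old-domain URL during a migration.

    status:     HTTP status code returned by the old URL (without following redirects)
    location:   value of the Location header, if any
    new_domain: the domain you migrated to
    """
    if status == 301 and location and new_domain in location:
        # Exactly what you want: a permanent redirect consolidating signals.
        return "ok: permanent redirect to the new domain"
    if status in (302, 303, 307):
        # Temporary redirects pass users through but signal a temporary move.
        return "warn: temporary redirect; prefer a 301 so Google consolidates signals"
    if status == 200:
        # The old page still resolves, so Google has no reason to drop it.
        return "warn: old page still resolves with no redirect"
    if status == 404:
        # The page is gone with no redirect, so its link profile is lost.
        return "warn: page is gone; without a 301 its link equity is lost"
    return f"check manually (status {status})"
```

Run it against a sample of old-domain URLs from a crawl or your server logs; on a healthy migration, nearly everything should classify as a 301 to the new domain.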
If you want to fill me in on more details I'm happy to help as best I can.
Cheers,
Rob