Best way to move the content to a different domain without inviting any SERP penalty?
-
Hi all,
We are in a bit of a fix right now. We have around 60-70 articles (WordPress pages/posts) that we intend to move to another domain of ours. What's the best way to do so without inviting a Google penalty?
Here is some detailed information about our case:
Let's say our site example.com has more than 2,000 articles. To better position the content for one of the sections on example.com, we have started another website, example2.com, and want to move those 60-70 articles from example.com to example2.com. What is the best way to do it so that we are not penalised by Google? Is it to (a) move all the said content (60-70 articles) from example.com to example2.com and (b) set up a permanent (301) redirect from each of the old article URLs to the corresponding new article URLs?
What are the other options?
-
Thanks Tom and Donford, that helped a lot.
We just posted a follow-up question to this over here - http://moz.com/community/q/is-it-possible-to-move-a-couple-of-posts-and-comments-from-one-wp-domain-to-another
-
100% agree with Tom. There really is nothing I can think to add.
-
Hi Shalin,
You've already listed the best option available.
Add those articles to your new website and then 301 the old pages to the new ones. That is the best possible solution.
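For anyone looking for the mechanics: on a typical Apache-hosted WordPress site, the 301s can live in the old site's .htaccess. This is only a minimal sketch, assuming Apache with mod_alias enabled; the domains and article slugs below are placeholders, not from the thread.

```apache
# Hypothetical .htaccess rules on the old site (example.com).
# One permanent redirect per moved article; mod_alias's Redirect
# matches the URL path and sends a 301 to the new location.
Redirect 301 /widget-article-1/ https://example2.com/widget-article-1/
Redirect 301 /widget-article-2/ https://example2.com/widget-article-2/
```

You can verify each redirect with `curl -I http://example.com/widget-article-1/` and check that the response is a 301 with a `Location:` header pointing at the new domain.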
Alternatively, you could change the canonical tags on the old website to point to the new website - i.e. on www.oldsite.com/article-1 you have this canonical tag: `<link rel="canonical" href="http://www.newsite.com/article-1" />` - and so on. However, this is a longer way of doing the same thing as a 301, and the effect may not be instant either.
A third option would be to add a noindex,nofollow robots meta tag to the old site's pages to deindex them.
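For reference, the robots meta tag described above is a one-line addition to the `<head>` of each old page - a sketch, with the directive values exactly as the answer suggests:

```html
<!-- Hypothetical robots meta tag in the <head> of each old article page.
     noindex asks Google to drop the page from the index;
     nofollow asks it not to follow the page's links. -->
<meta name="robots" content="noindex,nofollow">
```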
The last two solutions would let you keep the articles on both sites - but in this case I would not recommend it, because even with the canonical change and/or noindexing, it could be misinterpreted as you looking to flesh out your site with more content that isn't original. I'd recommend the 301 redirect.
Related Questions
-
Description tag in code is different from what is shown in SERPS...
Hi there: We have a client whose website we built in WP, using Yoast Pro as our SEO plugin. I was reading some reports (actually coming out of SEMrush, but we use Moz as well) and I am getting really varying results in the description area of the SERPs. Even though I'm seeing the copy we wrote in Yoast in the description tag code, the SERP is showing an excerpt from the copywriting on the site. What's even weirder is that SEMrush is pulling an entirely DIFFERENT description. I'm obviously missing out on the finer points of description tags, as Google clearly does not always choose to feature what is actually written in the description tag itself. Can someone explain what might be going on here? Thanks in advance,
Intermediate & Advanced SEO | | Daaveey1 -
Duplicate Content Question With New Domain
Hey Everyone, I hope your day is going well. I have a question regarding duplicate content. Let's say that we have Website A and Website B. Website A is a directory for multiple stores & brands. Website B is a new domain that will satisfy the delivery niche for these multiple stores & brands (where they can click on a "Delivery" anchor on Website A and it'll redirect them to Website B). We want Website B to rank organically when someone types in "<brand> delivery" in Google. Website B has NOT been created yet. The Issue: Website B has to be a separate domain than Website A (no getting around this). Website B will also pull all of the content from Website A (menus, reviews, about, etc). Will we face any duplicate content issues on either Website A or Website B in the future? Should we rel=canonical to the main website even though we want Website B to rank organically?
Intermediate & Advanced SEO | | imjonny0 -
Penalty for adding too much content too quickly?
Hi there, We released around 4,000 pieces of new content, which all ranked on the first page and did well. We had a database of ~400,000 pieces, so we released the entire library in a couple of days (all remaining 396,000 pages). The pages have indexed. The pages are not ranking, although the initial batch is still ranking, as are a handful (literally a handful) of the new 396,000. When I say not ranking - I mean not ranking anywhere (I've gone as far as page 20), yet the initial batch would rank for competitive terms on page 1. Does Google penalise you for releasing such a volume of content in such a short space of time? If so, should we deindex all that content and re-release it in slow batches? And finally, if that is the course of action we should take, are there any good articles on deindexing content at scale? Thanks so much for any help you are able to provide. Steve
Intermediate & Advanced SEO | | SteveW19870 -
Best way to handle page filters and sorts
Hello Mozzers, I have a question that has to do with the best way to handle filters and sorts with Googlebot. I have a page that returns a list of widgets. I have a "root" page about widgets and then filter and sort functionality that shows basically the same content but adds parameters to the URL. For example, if you filter the page of 10 widgets by color, the page returns 3 red widgets on the top, and 7 non-red widgets on the bottom. If you sort by size, the page shows the same 10 widgets sorted by size. We use traditional PHP URL parameters to pass filters and sorts, so obviously Google views each of these as a separate URL. Right now we really don't do anything special for Google, but I have noticed in the SERPs that sometimes if I search for "Widgets", my "Widgets" and "Widgets - Blue" pages both rank close to each other, which tells me Google basically (rightly) thinks these are all just pages about Widgets. Ideally, though, I'd just want to rank for my "Widgets" root page. What is the best way to structure this setup for Googlebot? I think it's maybe one or many of the following, but I'd love any advice: put a rel=canonical tag on all of the pages with parameters and point it to the "root"; use the Google parameter tool and have it not crawl any URLs with my parameters; put a meta noindex on the parameter pages. Thanks!
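The rel=canonical option mentioned in the question would look like this on each parameterized URL - a sketch only, with a hypothetical domain and parameter:

```html
<!-- Hypothetical canonical tag placed on a filtered/sorted URL such as
     https://www.example.com/widgets?color=blue or ...?sort=size.
     It points consolidation signals back at the "root" widgets page. -->
<link rel="canonical" href="https://www.example.com/widgets/">
```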
Intermediate & Advanced SEO | | jcgoodrich0 -
Best way to permanently remove URLs from the Google index?
We have several subdomains we use for testing applications. Even if we block them with robots.txt, these subdomains still appear to get indexed (though they show as blocked by robots.txt). I've claimed these subdomains and requested permanent removal, but it appears that after a certain time period (6 months?) Google will re-index them (and mark them as blocked by robots.txt). What is the best way to permanently remove these from the index? We can't use a login to block access because our clients want to be able to view these applications without needing to log in. What is the next best solution?
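The behaviour described here is expected: robots.txt prevents crawling, not indexing, so blocked URLs can still appear in the index from external links. A commonly recommended fix (not stated in this thread) is to remove the robots.txt block so Googlebot can fetch the pages, and serve a noindex directive via an HTTP header instead. A minimal sketch, assuming Apache with mod_headers enabled in the test subdomain's server config:

```apache
# Hypothetical config for a test subdomain's virtual host (Apache,
# mod_headers assumed). Remove the robots.txt Disallow first, so
# Googlebot can actually fetch the pages and see this header.
Header set X-Robots-Tag "noindex, nofollow"
```

The X-Robots-Tag header also covers non-HTML responses (PDFs, images) that a robots meta tag can't reach.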
Intermediate & Advanced SEO | | nicole.healthline0 -
Best way to help a city-centric service provider market in new nearby territories?
Our client recently acquired new county territories outside the main city area. We could create separate location pages under the primary domain, but are wondering if microsites with unique content (and location-including URLs) that link back to the location pages would also be a good idea. There is some traction for certain location-based keywords in those areas. Better to focus on the one domain, or augment with separate websites in different parts of the state? I can come up with plausible reasons for and against either, but would love your thoughts. Thank you for any insight!
Intermediate & Advanced SEO | | PerfectPitchConcepts0 -
What's the best way to manage content that is shared on two sites and keep both sites in search results?
I manage two sites that share some content. Currently we do not use a cross-domain canonical URL and allow both sites to be fully indexed. For business reasons, we want both sites to appear in results and need both to accumulate PR and other SEO/Social metrics. How can I manage the threat of duplicate content and still make sure business needs are met?
Intermediate & Advanced SEO | | BostonWright0 -
Are sub domains considered completely different than the root domain?
We have a project that is going to generate duplicate content. If we move the new content to a subdomain (e.g. product.domain.com), will it still be considered duplicate content relative to the root domain? Or is it like having two completely different domains? Thanks!
Intermediate & Advanced SEO | | tripled5110