Duplicate Content on Multinational Sites?
-
Hi SEOmozers
Tried finding a solution to this all morning but can't, so just going to spell it out and hope someone can help me!
Pretty simple, my client has one site www.domain.com. UK-hosted and targeting the UK market. They want to launch www.domain.us, US-hosted and targeting the US market.
They don't want to set up a simple redirect because
a) the .com is UK-hosted
b) there's a number of regional spelling changes that need to be made
However, most of the content on domain.com applies to the US market and they want to copy it onto the new website. Are there ways to get around any duplicate content issues that will arise here? Or is the only answer to simply create completely unique content for the new site?
Any help much appreciated!
Thanks
-
Hi Coolpink,
from what I understand of your question, the likely scenario for your client is this:
- .com for UK
- .us for USA
- both sites with almost identical content.
If I were you, I would follow these best practices:
- In Google Webmaster Tools, I'd set domain.com to geotarget the UK only. Even though .com domains are meant to be global, if you tell Google to geotarget the site to a specific country, it should follow that directive even though the domain is generic;
- Again in Google Webmaster Tools, I'd set the .us domain to geotarget the USA only. Note that .us is the country-code top-level domain (ccTLD) of the United States (as .co.uk is the ccTLD of the UK), so Google should geotarget domains with that ending to the USA automatically.
- I don't know the nature of your client's site, but if it is an eCommerce site, there are certainly local signals you can (and should) use: currencies (pounds and dollars), addresses, phone numbers.
- You write that the US and UK sites cannot simply be merged, partly because of the regional spelling changes needed. That's a correct intuition, also in terms of international SEO. So when creating the new .us site, pay attention to this issue and remember to convert to American English the content that was written in British English (e.g. analyse > analyze). These regional differences help Google a lot in understanding the target of the site.
- A good way to reinforce the fact that the .com site is meant only for the UK market would be to add rel="alternate" hreflang annotations to it, like this: <link rel="alternate" hreflang="en-us" href="http://www.domain.us/" /> (please read the note at the end)
- BING > This page on the Bing Webmaster Center Blog ("How to Tell Your Website's Country and Language") explains quite well the best practices to follow in order to rank in the regional versions of Bing. Of the options Bing suggests, the metadata-embedded-in-the-document one is the most important: that's the one to add to the .us site (target: USA).
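For reference, the embedded-metadata option is a language/region meta tag in the head of each page. A sketch of what it could look like for the US-targeted site follows; do check the Bing post for the exact variants it accepts, since the value here is my assumption for a page targeting the US:

```html
<head>
  <!-- Document-level signal of the page's language and target region -->
  <meta http-equiv="content-language" content="en-us" />
</head>
```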
Note well: the rel="alternate" hreflang="x" annotation is a page-level tag, not a domain-level one.
That means the home page will have:
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/" /> (as seen above)
that page "A" will have:
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/page-equivalent-to-A" />
and so on.
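Putting the note above together, here is a minimal sketch of what an equivalent pair of pages could carry. The URLs and page names are just placeholders based on the question, and I'm also including a self-referencing en-gb tag, which helps Google pair the two versions:

```html
<!-- In the <head> of http://www.domain.com/page-a (UK version) -->
<link rel="alternate" hreflang="en-gb" href="http://www.domain.com/page-a" />
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/page-a" />

<!-- The same two tags go in the <head> of http://www.domain.us/page-a,
     so the annotation is reciprocal -->
```

Keep in mind that hreflang annotations only count when both sides point at each other, so each US page needs to reference its UK equivalent as well.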
-
As of 3 years ago, Google wasn't filtering ccTLD sites for duplicate content, and I haven't found anything indicating that has changed. Rand also did a good Whiteboard Friday on this very subject.
-
My experience has been that well-researched copy tailored to the local market performed much better for non-brand terms. I don't use the .us TLD, but I host all my sites in Norway and Google has had no problems with my country TLDs such as .co.uk, .cn, .kr, etc.
-
Thanks for your reply Knut.
So you would advise against using the same copy?
Also, just to clarify, the .com is going to be the UK site, and they are planning on purchasing .us for the US site. Is this acceptable practice?
-
Even if the .com site is hosted in the UK, Google will figure out that .co.uk is for the UK and .com is for US customers. I manage two such sites, www.ekornes.com/us and www.ekornes.co.uk, and when the content was nearly duplicate we ranked well on brand-specific terms in both countries, but not on non-brand or brand-associated terms. The first thing you want to do is make the meta tags unique, then follow up with unique content. You'll find that if you do your keyword research well, creating unique content and tags becomes a lot easier, because consumers in different countries use different words and lingo to find your product.