Duplicate Content on Multinational Sites?
-
Hi SEOmozers
Tried finding a solution to this all morning but can't, so just going to spell it out and hope someone can help me!
Pretty simple, my client has one site www.domain.com. UK-hosted and targeting the UK market. They want to launch www.domain.us, US-hosted and targeting the US market.
They don't want to set up a simple redirect because
a) the .com is UK-hosted
b) there are a number of regional spelling changes that need to be made
However, most of the content on domain.com applies to the US market and they want to copy it onto the new website. Are there ways to get around any duplicate content issues that will arise here? Or is the only answer to simply create completely unique content for the new site?
Any help much appreciated!
Thanks
-
Hi Coolpink,
From what I understand from your question, the potential scenario for your client is this:
- .com for UK
- .us for USA
- both sites with almost identical content.
If I was you, I would follow these best practices:
- In Google Webmaster Tools, I'd specify that domain.com should geotarget the UK only. Even though .com domains are meant to be global, if you tell Google to geotarget your site to a specific country, it should follow your directive even though the domain is a generic one;
- Again in Google Webmaster Tools, I'd specify that the .us domain should geotarget the USA only. Note that .us is the country-code top-level domain (ccTLD) of the United States (as .co.uk is the ccTLD of the UK), so Google should geotarget domains with that ending to the USA automatically.
- I don't know the nature of your client's site, but if it is an eCommerce site, there are certainly local signals you can and should use: currencies (pounds and dollars), addresses, phone numbers.
- You write that the US and UK sites cannot be merged partly because of the regional spelling changes needed. That's a correct intuition, also in terms of international SEO. So, when creating the new .us site, pay attention to this issue and remember to translate into American English the content that was written in British English (e.g. analyse > analyze). These regional differences help Google a great deal in understanding the target of the site.
- A good way to reinforce that the .com site is meant only for the UK market is to add the rel="alternate" hreflang="x" annotation to it, like this: <link rel="alternate" hreflang="en-us" href="http://www.domain.us/" /> (please see the note at the end)
- BING > This page on Bing's Webmaster Center Blog ("How to Tell Your Website's Country and Language") explains quite well the best practices to follow in order to rank in the regional versions of Bing. Of the options Bing suggests, the metadata embedded in the document is the most important one: that's what should be added to the .us site (target: USA).
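As I understand Bing's post, the document-embedded metadata it describes is a content-language declaration in the page head. A minimal sketch for the .us site (hypothetical domain from the question) would be:

```html
<!-- In the <head> of pages on www.domain.us -->
<!-- Declares American English as the page's language/region -->
<meta http-equiv="content-language" content="en-us" />
```

The UK site would declare "en-gb" the same way.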
Note well: rel="alternate" hreflang="x" is a page-level annotation, not a domain-level one.
That means that the home page will have:
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/" /> (as seen above)
while page "A" will have:
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/page-equivalent-to-A" />
and so on.
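As a concrete sketch (using the hypothetical domains from the question), the head of the UK home page could carry reciprocal annotations for both versions, since hreflang works best when each page references its alternate and itself:

```html
<!-- In the <head> of http://www.domain.com/ (UK version) -->
<link rel="alternate" hreflang="en-gb" href="http://www.domain.com/" />
<link rel="alternate" hreflang="en-us" href="http://www.domain.us/" />
```

The US home page would carry the same two lines, so the two versions point at each other.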
-
As of three years ago, Google wasn't filtering ccTLD sites for duplicate content, and I haven't found anything indicating that this has changed. Rand also had a good Whiteboard Friday on this very subject.
-
My experience was that well-researched copy tailored to the local market performed much better for non-brand terms. I don't use the .us TLD, but I host all my sites in Norway and Google has had no problems with my country TLDs such as .co.uk, .cn, .kr, etc.
-
Thanks for your reply Knut.
So you would advise against using the same copy?
Also, just to clarify, the .com is going to be the UK site, and they are planning on purchasing .us for the US site. Is this acceptable practice?
-
Even if the .com site is hosted in the UK, Google will figure out that .co.uk is for UK customers and .com is for US customers. I manage two such sites, www.ekornes.com/us and www.ekornes.co.uk, and when the content was nearly duplicated we ranked well on brand terms in both countries, but not on non-brand or brand-associated terms. The first thing you want to do is make the meta tags unique, then follow up with unique content. You'll find that if you do your keyword research well, creating unique content and tags becomes a lot easier, as consumers in different countries use different words and lingo to find your product.
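To illustrate what "unique meta tags" can mean in practice (these titles are invented for illustration, not Ekornes' actual tags), localising spelling and terminology alone already differentiates the two versions:

```html
<!-- UK page (hypothetical example) -->
<title>Recliner Chairs – Personalise Your Comfort | Brand UK</title>
<meta name="description" content="Browse our range of recliner chairs. Free delivery across the UK." />

<!-- US page (hypothetical example) -->
<title>Recliner Chairs – Personalize Your Comfort | Brand US</title>
<meta name="description" content="Shop our selection of recliner chairs. Free shipping across the USA." />
```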