Geographic site clones and duplicate content penalties
-
We sell wedding garters - niche, I know!
We have a site (weddinggarterco.com) that ranks very well in the UK and sells a lot to the USA despite its rudimentary currency functions (Shopify makes US customers check out in GBP, which isn't helpful for conversions).
To improve this I built a clone (theweddinggarterco.com) and have faked a kind of location selector top right. Needless to say, a lot of content on this site is VERY similar to the UK version. My questions are...
1. Is this likely to stop me ranking the USA site?
2. Is this likely to harm my UK rankings?
Any thoughts very welcome! Thanks. Mat
-
Well, I may be biased because this is what I wanted to hear, but personally I think it's spot on, particularly the Kissmetrics article from a later response. I have already set geo-targeting and will also sort out the hreflang tags.
I plan to leave both sites on .com domains - in the UK, .coms are just as 'normal' as .co.uks. All content has been updated to US English and with specific relevant info, so I think it's just down to the usual link building and adding content to get it to rank.
I genuinely appreciate all the responses, fantastically useful, thank you!
Mat
-
Hi Dave,
Because it's a bot that's examining the site, you need the hreflang & geo-targeting. Algorithms are not perfect, and mistakes do happen, but I am convinced that in the long run you win by staying close to the guidelines (and certainly by putting the interests of your visitors/customers first).
Personally, I think this whole duplicate content issue is a bit overrated (and I am not the only one - check this post on Kissmetrics). In most cases, when it finds duplicate content Google will just pick one of the sites to show in the results and not show the others, unless the duplicate content has a clear intent of spamming. Panda is mainly about thin and/or low-quality content, or content duplicated from other sites (without hreflang/geo-targeting etc.), so I would consider the risk in this case rather low.
There was a discussion on the Google product forums which is quite similar to this one (Burberry had a massive traffic drop on its US site) - and the answer from JohnMu from Google was quite similar to the answer I gave: use geo-targeting & hreflang.
rgds,
Dirk
-
I do agree that by the guidelines taken verbatim you could make a good case. My concern is that it's not some guy at Google sitting down, judging sites and asking, "Does this violate the guidelines?" - it's a bot, and as I'm sure everyone here can attest, Panda and Penguin aren't perfect. One can just ask Barry Schwartz of the very credible SE Roundtable about getting hit with a Panda false positive on content issues, and about the cost in traffic it causes. Or you can read his post on it here.
Or maybe I'm just paranoid. That could well be.
-
Hi,
I tend to disagree with the answers above. If you check the "official" Google point of view, it states: "This (= duplicate content) is generally not a problem as long as the content is for different users in different countries."
So - you should make it obvious that the content is for different users in different countries.
1. Use Webmaster Tools to set the target geography:
- set weddinggarterco.com to UK
- set theweddinggarterco.com to US
You could also consider putting weddinggarterco.com on weddinggarter.co.uk and redirecting weddinggarterco.com to the .co.uk version (currently the redirection is the other way round). This way you could leave theweddinggarterco.com without a specific geo-target (useful if you also want to target countries like AU).
2. Use hreflang on both sites (on all the pages). You can find a generator here and a tool to check whether it's properly implemented here. Other interesting articles on hreflang can be found here and here.
3. It seems you have already adapted a few pages to be more tailored to the US market (shipping, prices) - I'm not sure whether you have already put the content in US English.
4. I imagine the sites are hosted in the UK. Make sure that the .com version loads fast enough - check both versions on webpagetest.org with US & UK IPs and see if there is a difference in load times. If you're not using one already, consider a CDN.
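To illustrate point 2, the hreflang annotations could look something like the sketch below. The homepage URLs are the two domains from the question; everything else (protocol, the idea of a shared x-default) is an assumption for illustration:

```html
<!-- In the <head> of the UK homepage (weddinggarterco.com) and the
     US homepage (theweddinggarterco.com) alike: each page lists both
     country variants, including a self-reference. -->
<link rel="alternate" hreflang="en-gb" href="http://weddinggarterco.com/" />
<link rel="alternate" hreflang="en-us" href="http://theweddinggarterco.com/" />

<!-- Optional: a fallback for visitors from any other country (e.g. AU),
     pointing at whichever version should catch unmatched traffic. -->
<link rel="alternate" hreflang="x-default" href="http://theweddinggarterco.com/" />
```

Each inner page needs its own equivalent pair pointing at its counterpart on the other domain, and the annotations must be reciprocal - if a UK page references a US page, that US page has to reference the UK page back, or Google ignores the tags.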
If you do all of the above, you normally should be fine. Hope this helps,
Dirk
-
Hi there.
You can face duplicate content issues. What you can do is use hreflang and/or canonical links. This should sort it out and help ensure that your rankings don't drop.
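Worth noting that the two options behave differently: hreflang keeps both versions in the index and serves the right one per country, while a cross-domain canonical tells Google to index only the preferred version. If you did go the canonical route, it's a one-line tag in the head of the duplicate page - a sketch, with a made-up product path for illustration:

```html
<!-- On the page you do NOT want indexed, pointing at the version you do.
     The path here is illustrative, not from the question. -->
<link rel="canonical" href="http://weddinggarterco.com/products/lace-garter" />
```

In this thread's scenario (two sites that should each rank in their own country), hreflang plus geo-targeting is the better fit, since the canonical route would effectively drop one site from the results.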
Cheers.
-
There are always exceptions to rules, but for safety I would highly recommend blocking the .com site until you can get some real unique content on it. It stands a high chance of taking its own devaluation (almost certain) and may impact the UK site (and really... why risk it?).
If the scenario were mine, I'd have simply built in customized pricing and other relevant information based on IP, but if that's not your area (and fair enough, as that can get a bit complicated) then the redirection you're doing now to just get visitors to the right site is the logical option. I'd just block the .com in your robots.txt, put the noindex,nofollow meta in there for good measure, and start working on some good unique content - and if you won't have time for that, just enjoy your UK rankings.
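A sketch of that blocking setup, using standard robots syntax (nothing here is specific to the questioner's site):

```
# robots.txt at the root of the site to be blocked
User-agent: *
Disallow: /
```

```html
<!-- In the <head> of every page on the site to be kept out of the index: -->
<meta name="robots" content="noindex,nofollow">
```

One caveat: if robots.txt blocks crawling entirely, Googlebot never fetches the pages and so never sees the noindex tag. For getting already-indexed pages removed, it's the meta tag on crawlable pages that does the actual work, so you may want to leave crawling open until the pages have dropped out.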
Related Questions
-
Duplicate Content For Product Alternative listing
Hi, I have a tricky one here. Cloudswave is a directory of products, and we are launching new pages called "Alternatives to Product X". Such a page displays 10 products that are alternatives to product X (Page A). Let's say now you want the alternatives to a similar product within the same industry, product Y (Page B): you will have 10 product alternatives, but this page will be almost identical to Page A, as the products are similar and in the same industry. Maybe one or two products will differ between the two listings. Now, even if the SEO tags are different, aren't those two pages considered duplicate content? What are your suggestions to avoid this problem? Thank you, guys.
Intermediate & Advanced SEO | RSedrati
-
Duplicate content issues from mirror subdomain: facebook.domainname.com
Hey guys,
Need your suggestions. I have got a website that has a duplicate content issue: a subdomain called facebook.asherstrategies .com comes from nowhere and is getting indexed.
Website link: asherstrategies .com
Subdomain link: facebook.asherstrategies .com
This subdomain is actually a mirror of the website, and I have no idea how it was created. I'm trying to resolve the issue but could not find a clue.
Intermediate & Advanced SEO | b2bmarketer
-
How should I manage duplicate content caused by a guided navigation for my e-commerce site?
I am working with a company which uses Endeca to power the guided navigation for our e-commerce site. I am concerned that the duplicate content generated by having the same products served under numerous refinement levels is damaging the site's ability to rank well, and I was hoping the Moz community could help me understand how much of an impact this type of duplicate content could be having. I would also love to know whether there are any best practices for managing this type of navigation. Should I nofollow all of the URLs which have more than one refinement applied to a category, or should I allow the search engines to go deeper than that to preserve the long tail? Any help would be appreciated. Thank you.
Intermediate & Advanced SEO | FireMountainGems
-
Duplicate content within sections of a page but not full page duplicate content
Hi, I am working on a website redesign, and the client offers several services, some elements of which cross over with one another. For example, they offer a service called Modelling, and when you click onto that page several elements that build up that service are featured - in this case, 'mentoring'. Now mentoring is common to other services and will therefore feature on other service pages. The page will feature a mixture of content unique to that service and small sections of duplicate content, and I'm not sure how to treat this. One idea we have come up with is to take the user through to a unique page to host all the content; however, some features do not warrant a page being created for them. Another idea is to have the feature pop up with inline content. Any thoughts/experience on this would be much appreciated.
Intermediate & Advanced SEO | J_Sinclair
-
Finding Duplicate Content Spanning more than one Site?
Hi forum, SEOmoz's crawler identifies duplicate content within your own site, which is great. How can I compare my site to another site to see if they share "duplicate content"? Thanks!
Intermediate & Advanced SEO | Travis-W
-
Duplicate content mess
One website I'm working with keeps an HTML archive of content from various magazines they publish. Some articles were repeated across different magazines, sometimes up to 5 times. These articles were also used as content elsewhere on the same website, resulting in up to 10 duplicates of the same article on one website. With regard to the 5 that are duplicates but not contained in a magazine, I can delete (resulting in 404) all but the highest-value copy of each (most don't have any external links). There are hundreds of occurrences of this, and it seems unfeasible to 301 or noindex them. After seeing how their system works, I can canonical the remaining duplicate that isn't contained in a magazine to the corresponding original magazine version - but I can't canonical any of the other versions in the magazines to the original. I can't delete the other duplicates, as they're part of the content of a particular issue of a magazine. The best thing I can think of is adding a link in the magazine duplicates to the original article, something along the lines of "This article originally appeared in...", though I get the impression the client wouldn't want to reveal that they used to share so much content across different magazines. The duplicate pages across the different magazines do differ slightly as a result of the different contents menu for each magazine. Do you think it's simply a case of what I'm doing being better than how it was, or is there something further I can do? Is adding the links enough? Thanks. 🙂
Intermediate & Advanced SEO | Alex-Harford
-
What constitutes duplicate content?
I have a website that lists various events. There is one particular event at a local swimming pool that occurs every few months - for example, once in December 2011 and again in March 2012. It will probably happen again sometime in the future too. Each event has its own 'event' page, which includes a description of the event and other details. In the example above, the only thing that changes is the date of the event, which is in an H2 tag. I'm getting this flagged as duplicate content in SEOmoz Pro. I could combine these pages, since the vast majority of the content is duplicated, but this would be a lot of work. Any suggestions on a strategy for handling this problem?
Intermediate & Advanced SEO | ChatterBlock
-
Google consolidating link juice on duplicate content pages
I've observed some strange findings on a website I am diagnosing, and they have led me to a possible theory that seems to fly in the face of a lot of thinking. My theory is:
When Google sees several duplicate content pages on a website, and decides to show just one version of the page, it at the same time aggregates the link juice pointing to all the duplicate pages, and ranks the one duplicate content page it decides to show as if all the link juice pointing to the duplicate versions were pointing to that one version. E.g.:
Link X -> Duplicate Page A
Link Y -> Duplicate Page B
Google decides Duplicate Page A is the most important one and applies the following formula to decide its rank: Link X + Link Y (minus some dampening factor) -> Page A.
I came up with the idea after I seem to have reverse-engineered this: the website I was trying to sort out for a client had this duplicate content issue, so we decided to put unique content on Page A and Page B (not just one pair like this but many). Bizarrely, after about a week, all the Page A's dropped in rankings, indicating the possibility that the old link consolidation had been correctly re-associated with the two separate pages, so now Page A would only be getting link value X. Has anyone got any test/analysis to support or refute this?
Intermediate & Advanced SEO | James77