Geographic site clones and duplicate content penalties
-
We sell wedding garters, niche I know!
We have a site (weddinggarterco.com) that ranks very well in the UK and sells a lot to the USA despite its rudimentary currency functions (Shopify makes US customers check out in GBP, which doesn't help conversions).
To improve this I built a clone (theweddinggarterco.com) and have faked a kind of location selector top right. Needless to say, a lot of the content on this site is VERY similar to the UK version. My questions are...
1. Is this likely to stop me ranking the USA site?
2. Is this likely to harm my UK rankings?
Any thoughts very welcome! Thanks. Mat
-
Well, I may be biased because this is what I wanted to hear, but personally I think it's spot on, particularly the Kissmetrics article from a later response. I have set geo-targeting already and will also sort out the hreflang tags.
I plan to leave both sites on .com domains - in the UK .coms are just as 'normal' as .co.uks. All content has been updated to US English and with specific relevant info, so I think it's just down to the usual link building and adding content to get it to rank.
I genuinely appreciate all the responses, fantastically useful, thank you!
Mat
-
Hi Dave,
Because it's a bot that's examining the site, you need the hreflang & geo-targeting. Algorithms are not perfect, and mistakes do happen, but I am convinced that in the long run you win by staying close to the guidelines (and certainly by putting the benefit of your visitors/customers first).
Personally, I think this whole duplicate content issue is a bit overrated (and I am not the only one - check this post on Kissmetrics). In most cases, when it finds duplicate content Google will just pick one of the sites to show in the results and not show the others, unless the duplication has a clear intent of spamming. Panda is mainly about thin and/or low-quality content, or content duplicated from other sites (without hreflang/geo-targeting etc.), so I would consider the risk in this case rather low.
There was a discussion on Google product forums which is quite similar to this one (Burberry had a massive traffic drop on its US site) - and the answer from JohnMu from Google was quite similar to the one I gave: use geo-targeting & hreflang.
rgds,
Dirk
-
I do agree that, taking the guidelines verbatim, you could make a good case. My concern is that it's not some guy at Google sitting down, judging sites and asking, "Does this violate the guidelines?" - it's a bot, and as I'm sure everyone here can attest, Panda and Penguin aren't perfect. One can just ask Barry Schwartz of the very credible SE Roundtable about getting hit with a Panda false positive on content issues and about the cost in traffic it caused. Or you can read his post on it here.
Or maybe I'm just paranoid. That could well be.
-
Hi,
I tend to disagree with the answers above. If you check the "official" Google point of view it states: "This (=duplicate content) is generally not a problem as long as the content is for different users in different countries"
So - you should make it obvious that the content is for different users in different countries.
1. Use Webmaster Tools to set the target geography:
- set weddinggarterco.com to UK
- set theweddinggarterco.com to US
You could also consider putting weddinggarterco.com on weddinggarter.co.uk and redirecting weddinggarterco.com to the .co.uk version (currently the redirection is the other way round). That way you could leave theweddinggarterco.com without a specific geo-target (useful if you also want to target countries like AU).
2. Use the HREFLANG on both sites (on all the pages). You can find a generator here and a tool to check if it's properly implemented here. Other interesting articles on HREFLANG can be found here and here
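To make the idea concrete, here is a minimal sketch (my own illustration, not the generator linked above) of the reciprocal hreflang tags both sites would carry - the en-gb/en-us codes and the assumption that URL paths match across the two domains are mine:

```python
# Sketch: reciprocal hreflang tags for the two sites.
# Assumes the same path exists on both domains.

SITES = {
    "en-gb": "https://weddinggarterco.com",
    "en-us": "https://theweddinggarterco.com",
}

def hreflang_tags(path):
    """Return the <link rel="alternate"> tags that every page
    (on BOTH sites) should include for the given path."""
    return [
        f'<link rel="alternate" hreflang="{lang}" href="{root}{path}" />'
        for lang, root in SITES.items()
    ]

for tag in hreflang_tags("/collections/lace-garters"):
    print(tag)
```

The key detail is that each page outputs both tags - one pointing at itself and one at its twin. If the annotation isn't reciprocal, Google ignores it.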
3. It seems you have already adapted a few pages to be more tailored to the US market (shipping, prices) - not sure if you have already put the content in US English.
4. I imagine the sites are hosted in the UK. Make sure the .com version loads fast enough - check both versions on webpagetest.org with US & UK IPs and see if there is a difference in load times. If you're not using one already, consider a CDN.
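As a rough sketch of the comparison webpagetest.org automates (the URLs are the poster's; everything else here is my assumption, and real testing should still be done from actual US and UK locations):

```python
import time
from urllib.request import urlopen

def timed_fetch(url):
    """Fetch a URL once and return the elapsed seconds (network-dependent)."""
    start = time.perf_counter()
    urlopen(url).read()
    return time.perf_counter() - start

def slowest(urls, fetch=timed_fetch):
    """Return (url, seconds) for the slowest-loading URL.

    `fetch` is injectable so the comparison logic can be exercised
    without touching the network."""
    timings = {url: fetch(url) for url in urls}
    worst = max(timings, key=timings.get)
    return worst, timings[worst]

# e.g. slowest(["https://weddinggarterco.com/", "https://theweddinggarterco.com/"])
```

Run it from a US machine and from a UK machine; if the US-targeted .com is consistently the slow one when fetched from the US, that's the signal a CDN would help.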
If you do all of the above, you should normally be fine. Hope this helps,
Dirk
-
Hi there.
You could face a duplicate content issue. What you can do is use hreflang and/or canonical links. That should put things right and help ensure your rankings don't drop.
Cheers.
-
There are always exceptions to rules, but to be safe I would highly recommend blocking the .com site until you can get some real unique content on it. It stands a high chance of taking its own devaluation (almost certain) and may impact the .co.uk site (and really ... why risk it?).
If the scenario were mine I'd have simply built in customized pricing and other relevant information based on IP, but if that's not your area (and fair enough, as that can get a bit complicated) then the redirection you're doing now to get visitors to the right site is the logical option. I'd just block the .com in your robots.txt, put the noindex,nofollow meta in there for good measure, and start working on some good unique content - and if you won't have time for that, just enjoy your UK rankings.
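If you do go the blocking route, it's worth verifying that the rules actually disallow crawling; here is a quick sketch using Python's standard-library parser (the robots.txt content shown is my assumption of what would be deployed):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for the blocked .com site
# (paired with <meta name="robots" content="noindex,nofollow"> on each page).
ROBOTS_TXT = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Under these rules no crawler may fetch any page.
print(parser.can_fetch("Googlebot", "https://theweddinggarterco.com/"))
```

One caveat worth knowing: when robots.txt blocks crawling, Googlebot never actually sees the noindex meta tag on the page, so a blocked URL can still appear URL-only in results. If the goal is deindexing rather than just crawl-blocking, noindex without the robots.txt block is generally the more reliable combination.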