I have a site that has 65 different versions of itself.
-
I've just started managing a site that serves over 50 different countries, and the entire web enterprise is being flagged for duplicate content because there is so much of it. What's the best approach to stop this duplicate content while still serving all of the countries we need to?
-
Yes sir, I agree it will be a "bit of an effort". Thank you both for some great guidance, and if anybody else has other solutions to these types of issues, I welcome your feedback as well.
-
It may be a bit of an effort, but is it possible for you to work your way through the pages and make the content, titles and descriptions unique so that they don't get flagged as duplicate content?
This has the added advantage of giving you a large number of pages targeted at your various keyphrases, whereas other approaches involving 301 redirects or rel="nofollow" reduce the duplicate content issue but also reduce the number of pages on which you can target keyphrases. If the sites are spread across 50 countries, is there a local spin that can be put on the content, so that all the relevant terms are targeted in their regions but Google doesn't see 50+ versions of the same site?
-
Thank you, Flatiron.
Yes, the content is on different servers due to the different countries they serve, as well as the languages. The client's US site is what I am working to improve, and they currently have over 2,500 duplicate title tags and meta descriptions out there. Would modifying the robots.txt file to instruct the search engines to crawl only the one main site and ignore the others be the best solution? My train of thought goes back to a case at a previous company where their product list pages were seen as duplicates, because each of the "sort" parameters was being recognized as a duplicate by the search engines. We had to write an instruction to only crawl the first sorted results.
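The "only crawl the first sorted results" instruction mentioned above is usually done in robots.txt. A minimal sketch, assuming the sort order is passed as a `sort` query parameter (the parameter name here is hypothetical, not from the thread; note that the `*` wildcard is an extension honored by Google and Bing rather than part of the original robots.txt standard):

```text
# robots.txt - block crawling of sorted variants of list pages,
# so only the default (first) sort order is crawled.
# Assumes sorting is expressed as a "sort" query parameter.
User-agent: *
Disallow: /*?sort=
Disallow: /*&sort=
```

One caveat: robots.txt only blocks crawling, and blocked URLs can still end up indexed if they are linked to. For parameter duplicates, a rel=canonical tag on each sorted variant pointing at the unsorted page is often the more robust fix.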
-
Hi Ken,
Is the content actually exactly the same but running on different domains? That will determine how to approach this issue. If all the content is the same, you can either use 301 redirects or rel=canonical tags to help the engines view the multiple sites as a single site and consolidate any link juice associated with each of the 50 sites. If the content isn't an exact duplicate, then it, or the page titles, must be extremely similar. In the long run I would recommend localizing your content, not only to help from an SEO perspective but also to improve the user experience and, hopefully, the conversion rates as well.
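To make the rel=canonical option above concrete, here is a minimal sketch (the domain and path are placeholders, not the poster's actual URLs): each duplicate page carries a tag in its `<head>` pointing at the single version you want the engines to credit.

```html
<!-- Hypothetical example: placed in the <head> of a duplicate page
     (e.g. a country copy of a product page), pointing at the version
     that should receive the consolidated ranking signals. -->
<link rel="canonical" href="https://www.example.com/us/products/widget/" />
```

The practical difference from a 301: a canonical tag leaves the duplicate page visible to visitors while consolidating signals, whereas a 301 physically redirects users and engines to the target URL.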
Related Questions
-
Best way to interlink 25 different language versions of a website?
I have a website which has 25 different language versions on 16 different domains. Hreflang tags are set up to point to the different language versions, and in the footer we have deep links to all 25 language versions. The site is not spammy, but it is in a small niche, and many language versions have very few other external links. For some time this site has lost rankings, for reasons that are unclear until now.
I see that large international sites such as booking.com, TripAdvisor and Apple all use different approaches to interlink their language versions. Interestingly, TripAdvisor nowadays loads the links to its other language versions dynamically, only upon click, so that these links do not show up in the source code, deviating from its former implementation of static deep links to all language versions.
Matt Cutts mentioned back in 2013: "If you have 50 different sites, I wouldn't link to all 50 sites down in the footer of your website, because that can start to look pretty spammy to users. Instead you might just link to no more than three or four or five down in the footer, that sort of thing, or have a link to a global page, and the global page can talk about all the different versions and country versions of your website." But in its webmaster guidelines, Google recommends: "Consider cross-linking each language version of a page. That way, a French user who lands on the German version of your page can get to the right language version with a single click."
I assume these links have little SEO value anyway, but for user experience it would certainly be better to provide deep links to the other language versions somewhere. The fact that the language versions are on different domains and have few external backlinks may also increase the risk a bit in our case. In doubt, I would prefer to be safe and load the deep links only upon click, the same as TripAdvisor. Any thoughts or suggestions on the best interlinking approach in our specific case?
International SEO | lcourse
-
In the U.S., how can I stop the European version of my site from outranking the U.S. version?
I've got a site with two versions – a U.S. version and a European version. Users are directed to the appropriate version through a landing page that asks where they're located; both sites are on the same domain, except one is .com/us and the other is .com/eu. My issue is that for some keywords, the European version is outranking the U.S. version in Google's U.S. SERPs. Not only that, but when Google displays sitelinks in the U.S. SERPs, it's a combination of pages on the European site and the U.S. site. Does anyone know how I can stop the European site from outranking the U.S. site in the U.S.? Or how I can get Google to only display sitelinks for pages on the U.S. site in the U.S. SERPs? Thanks in advance for any light you can shed on this topic!
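One common remedy for the situation described above (not confirmed by the thread, so treat this as a sketch with placeholder URLs) is to annotate each pair of pages with hreflang so Google knows which version to serve in which market. A complication worth noting: hreflang has no region code for "Europe" as a whole, so the /eu pages would typically be annotated per target country (or with a bare language code).

```html
<!-- Hypothetical example: both the /us and /eu copies of a page carry
     the same set of annotations in their <head>. "en-gb" stands in for
     one European target market; add one line per country targeted. -->
<link rel="alternate" hreflang="en-us" href="https://www.example.com/us/page/" />
<link rel="alternate" hreflang="en-gb" href="https://www.example.com/eu/page/" />
<link rel="alternate" hreflang="x-default" href="https://www.example.com/" />
```

The `x-default` line tells Google which version (here, the location-chooser landing page) to show searchers who match none of the listed locales.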
International SEO | matt-14567
-
How to handle rel=canonical on secondary TLDs for multi-regional sites
I currently have a .com domain, and I am thinking of duplicating its content onto another TLD, .co.uk (and regionalizing some aspects, like contact numbers, etc.). From my research, it seems that in Google Webmaster Tools you must then indicate which country you wish to target, in the .co.uk case the UK. My question is how I should handle rel=canonical on the duplicated site: should it point back to the .com or to the .co.uk? Any other pointers would also be appreciated. Thanks.
International SEO | dmccarthy
-
Multi-lingual Site (Tags & XML SiteMap Question)
We have two sites that target users in two different countries, in different languages, in the following manner: Site 1: es.site1.com (Spanish version); Site 2: site2.com/francais/. Navigation and content are translated into the foreign language from English.
What is the best way to let Google know about these multilingual pages? A. Add rel="alternate" and hreflang= in the source code for the hundreds of pages we have. B. Or is there a tool we can use to crawl the pages and create XML sitemaps for the different language pages? What do we need to do in the XML sitemap so that Google knows that, for example, sitemap1.xml relates to Spanish? Many thanks.
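For option B above, hreflang can be declared in the XML sitemap itself rather than in page source. A minimal sketch using the poster's Spanish subdomain (the paths are placeholders): each `<url>` entry must list every language version of that page, including itself, and the annotations must be reciprocated in the entry for the other language.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://es.site1.com/pagina/</loc>
    <xhtml:link rel="alternate" hreflang="es" href="https://es.site1.com/pagina/" />
    <xhtml:link rel="alternate" hreflang="en" href="https://www.site1.com/page/" />
  </url>
  <!-- Repeat a <url> block for the English page with the same pair of links. -->
</urlset>
```

There is no sitemap-level flag that marks a whole file as "Spanish"; the language relationship is expressed per URL via these `xhtml:link` entries.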
International SEO | CeeC-Blogger
-
My site is not showing on google.com ?
My website is not showing at all in google.com searches for my search terms. It does show if I search for my domain (xyz.com), but not for keywords. Searching by keywords and search terms shows my website on the first page of Yahoo, Bing and non-US Google properties such as google.co.uk or google.com.pk. I would appreciate any help in trying to understand why this is happening. The website is medicare.md. If you search for the term "medicare doctors PG county maryland", it is #1 in Bing and Yahoo but not even showing in google.com's first ten pages, although it is not banned. Interestingly, if you do that search on google.co.pk, it is #4. Quite puzzling! Would appreciate any help or advice. Sherif Hassan
International SEO | sherohass
-
Reciprocal Links between my own sites ?
Is it OK to have reciprocal links between sites you actually own? We have a website that has been regionalized to five countries, using five different domains. The content is exclusive to each country, but the keywords used might be similar. We have all the domains under the same Analytics account, and all of them share the same AdSense code. Can I be penalized by Google for making reciprocal links between them? Is it useful for improving SEO rankings, or should I avoid doing it? Thanks in advance.
International SEO | martincad
-
Adding content in different languages
I have a site which offers free printable greeting cards in English. I have noticed that many people find the site from other countries, searching with keywords in foreign languages. The bounce rate from these countries is high, presumably because visitors leave when they realize that all the cards are in English only. I was thinking of providing some greeting cards in different languages: each language would get a subfolder containing only the cards in that language, while the surrounding content would either stay in English or be automatically translated based on the visitor's IP. I do not plan to translate the site content into different languages, only the cards. What is the best way to do this?
International SEO | nicolebd
-
What is the best SEO site structure for multi country targeting?
Hi There,
We are an online retailer with four (and soon to be five) distinct geographic target markets (we have physical operations in both the UK and New Zealand). We currently target these markets like this:
United Kingdom (www.natureshop.co.uk)
New Zealand (www.natureshop.co.nz)
Australia (www.natureshop.com/au) - using a Google Webmaster Tools geo-targeted folder
United States (www.natureshop.com) - using a Google Webmaster Tools geo-targeted domain
Germany (www.natureshop.de) - in German and yet to be launched as a full site
We have various issues we want to address. The key one is this: our www.natureshop.co.uk website was adversely affected by the Panda update on April 12. We had some external SEO firms work on this site for us, and unfortunately the links they gained for us were very low quality, from sometimes spammy sites, and also "keyword"-packed with very little anchor text variation. Our other websites (the .co.nz and .com) moved up after the updates, so I can only assume our external SEO consultants were responsible for this. I have since managed to get them to remove around 70% of these links, and we have brought all SEO efforts back in house again. I have also worked to improve the quality of our content on this site, and I have 404'ed the six worst-affected pages (the ones that had far too many single-phrase anchor text links coming into them). We have, however, not budged much in our rankings (we have made some small gains, but not a lot).
Our other weaknesses are page load times that are not the fastest and some "thin" content. We are on the cusp (around four weeks away) of deploying a brand new platform using asp.net MVP with N2, and this looks like it will address our page load speed issues. We have also been working hard on our content building, and I believe we will address that as well with this release.
Sorry for the long build-up; however, I felt some background was needed to get to my questions.
My questions are:
1. Do you think we are best to proceed with trying to get our www.natureshop.co.uk website out of the Panda trap, or should we consider deploying a new version of the site on www.natureshop.com/uk/ (geo-targeted to the UK)?
2. If we do this, should we do the same for New Zealand and Germany and redirect the existing domains to the new geo-targeted folders?
3. If we do this, should we redirect the natureshop.co.uk pages to the new www.natureshop.com/uk/ pages, or will this simply pass on the Panda "penalty"?
4. Will this model build stronger authority on the .com domain that benefits all of the geo-targeted subfolders, or does it not work this way?
5. Finally, can we deploy the same pages and content on the different geo-targeted subfolders (with some subtle regional variations of spelling and language), or will this result in a duplicate content penalty?
Thank you very much in advance to all of you, and I apologise for the length and complexity of the question. Kind Regards
International SEO | ConradC
Conrad Cranfield
Founder: Nature Shop Ltd