Is the same content posted under different international TLDs a problem?
-
Dear all,
I have a site that owns the .be, .cn, .biz, .com.mx, .de, .us, .info, .net, and .org domains. All of them run from the same server with no difference in content, i.e. .com.mx/our-services is identical to .com/our-services.
Google Webmaster Help published a video saying that multiple international TLDs with the same content 'should be OK' (http://www.youtube.com/watch?v=Ets7nHOV1Yo), but I would like confirmation from practitioners!
What is the best practice in this case? Considering none of the content is customised, should I create root-level redirects to our .com, or leave things as they are?
Thanks!
Christian
-
Here are my two cents; take them for what they're worth. We took a .com site written in English and hosted in the U.S., duplicated the content, but redesigned the website. The duplicated content and new design were then hosted in Germany, hoping to target UK and other English-language searches in Europe, and the website was a complete flop. We then had a professional translation service convert the exact content from the U.S. website into German, updated the new site, and it started performing very well. We took the same approach in France and our other target markets; however, it didn't work in the UK.
-
Most responses I have seen to similar questions suggest redirecting the country-based TLDs to folders named for each country, but that usually assumes you have translations available.
If the content is exactly the same and not translated, I would probably just redirect to the .com domain. This TLD gets the most respect, and most users know it regardless of country. Also, if you use 301s, it will help consolidate your link popularity under the one domain name, catching people who don't actually check your URL and link to you as example.de instead of example.com.
That way, once you do put up translations (which you should be doing if you get lots of international users), you can switch the redirect to the appropriate folder and still have your links consolidated.
Just my 2 cents
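For concreteness, here is a minimal sketch of what that root-level consolidation could look like, assuming the domains all resolve to the same Apache server (example.com stands in for the real .com; both the server type and the domain are assumptions, not details from the thread):

    # .htaccess - 301 any host that isn't the canonical .com to the same path on it
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^(www\.)?example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]

Because every TLD points at the same document root, a single host check covers all nine domains, and swapping this rule for per-country folder redirects later (once translations exist) is a small change.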
Related Questions
-
Regional sites built on different platforms - will this solution for international targeting work?
We are working with our dev team on a few upcoming user stories to improve store.hp.com. We came across a question which isn't clear in the international targeting documentation. Within http://store.hp.com, we have a number of regional stores, but those are often built on separate platforms, so a story developed on the US infrastructure doesn't carry over to Canada and so forth. The Canada store is managed by a different team, so that story needs to get scoped, prioritized, etc. independently. With regard to helping Google understand page equivalence, will Google accept the page relationship if we include hreflang tags exclusively in the sitemap for the US site and exclusively as page-level markup for the Canada site? For example:

http://store.hp.com/CanadaStore (hreflang notation at page level):

    <link rel="alternate" hreflang="en-us" href="http://store.hp.com/us/en" />
    <link rel="alternate" hreflang="en-ca" href="http://store.hp.com/CanadaStore" />

http://store.hp.com/us/en (hreflang notation within the sitemap file):

    <url>
      <loc>http://store.hp.com/us/en</loc>
      <xhtml:link rel="alternate" hreflang="en-ca" href="http://store.hp.com/CanadaStore" />
      <xhtml:link rel="alternate" hreflang="en-us" href="http://store.hp.com/us/en" />
    </url>

Appreciate the help anyone can give! Zach
Technical SEO | ZachKline
-
Duplicate content problem
Hi there, I have a couple of related questions about the crawl report finding duplicate content.

We have a number of pages that feature mostly media - just a picture or just a slideshow - with very little text. These pages are rarely viewed, and they are identified as duplicate content even though the pages are indeed unique to the user. Does anyone have an opinion about whether we'd be better off just removing them, since we do not have the time to add enough text at this point to make them unique to the bots?

The other question: we have a redirect for any 404 on our site that follows the pattern immigroup.com/news/* - the redirect merely sends the user back to immigroup.com/news. However, Moz's crawl seems to be reading this as duplicate content as well. I'm not sure why that is, but is there anything we can do about it? These pages do not exist; the hits come from someone typing in the wrong URL or clicking a bad link. But we want the traffic - after all, the users land on a page that has a lot of content. Any help would be great! Thanks very much! George
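As a side note, a hedged sketch of one way to make that /news/* pattern crawler-friendly, assuming the site runs on Apache (the post doesn't say): return a real 301 to /news rather than serving the /news content at every nonexistent URL, so each bad URL stops looking like a duplicate page served with a 200 status.

    # .htaccess - only touch /news/ URLs that don't map to a real file or directory
    RewriteEngine On
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^news/.+$ /news [R=301,L]

This keeps the traffic George wants while giving crawlers a single canonical URL for it.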
Technical SEO | canadageorge
-
Duplicate content
Hi there, I have two pages, where the second duplicates the content of the first. Example: www.mysite.com/mypage and www.mysite.com/mysecondpage. If I insert a rel="canonical" tag, am I still making duplicate content? Best regards, Wendel
Technical SEO | peopleinteractive
-
Google inconsistent in display of meta content vs page content?
Our e-commerce site includes more than 250 brand pages - a large image, some fluffy text, maybe a video, links to categories for that brand, etc. In many cases, Google publishes our page title and description in their search results. However, in some cases, Google instead publishes our H1 and the aforementioned fluffy page content. We want our page content to read well, be descriptive of the brand, and be appropriate for the audience. We want our meta titles and descriptions brief and likely to attract CTR from qualified shoppers. I'm finding this difficult to manage when Google pulls from two different areas inconsistently. So my question: is there a way to ensure Google only uses our title/description for our listings?
Technical SEO | websurfer
-
How critical are duplicate content warnings?
Hi, I have created my first campaign here, and I have to say the tools, user interface, and on-page optimization are all useful; I am happy with SEOmoz. However, the crawl report returned thousands of errors, and most of them are duplicate content warnings. We use Drupal as our CMS, and the duplicate content is caused by Drupal's pagination problems. Let's say there is a page called "/top5list"; the crawler decided "/top5list?page=1" is a duplicate of "/top5list". There is no real solution for pagination problems in Drupal (as far as I know). I don't have any warnings in Google Webmaster Tools about this, and the sitemap I submitted to Google doesn't include the problematic deep pages that the SEOmoz crawler flags as duplicate content. So my question is: should I be worried about the thousands of error messages in crawler diagnostics? Any ideas appreciated.
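One commonly suggested workaround for this class of pagination duplicates, sketched here as a generic illustration rather than a Drupal-specific fix (the path is from the post, the markup placement and domain are assumptions), is a canonical link on the paginated variants pointing at the base page:

    <!-- in the <head> of /top5list?page=1 and other ?page=N variants -->
    <link rel="canonical" href="http://www.example.com/top5list" />

Whether to canonicalize a whole paginated series to its first page is itself debated; the snippet only shows the markup involved.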
Technical SEO | Gamer07
-
Problem with Crawling
Hello, I have a website, http://digitaldiscovery.eu, here in SEOmoz. It's strange: since last week, SEOmoz is crawling only one page! Before, it was crawling all the pages. What's happening? Help, SEOmoz! :))
Technical SEO | PedroM
-
Mapping Internal Links (which are causing duplicate content)
I'm working on a site that is throwing off a -lot- of duplicate content for its size. Much of it appears to come from bad links within the site itself, which were introduced when it was ported from static HTML to Expression Engine (by someone else). I'm finding EE an incredibly frustrating platform to work with, as it appears to direct 404s on sub-pages to the page directly above that sub-page without actually returning a 404 response. It's very weird. Does anyone have any recommendations on software to clearly map out a site's internal link structure, so that I can find which bad links point to the wrong pages?
Technical SEO | BedeFahey
-
Duplicate content conundrum
Hey Mozzers - I have a tricky situation with one of my clients. They're a reputable organization and have been mentioned in several major news articles. They want to create a Press page on their site with links to each article, but they want viewers to remain within the site and not be redirected to the press sites themselves. The other issue is that some of the articles have been removed from the original press sites where they were first posted. I want to avoid duplicate content issues, but I don't see how to repost the articles within the client's site. I figure I have three options:

1. Create PDFs (with SEO-friendly URLs) with the articles embedded in them that open in a new window.
2. Post an image with a screenshot of the article on a unique URL with brief content.
3. Copy and paste the article to a unique URL.

If anyone has experience with this issue or any suggestions, I would greatly appreciate it. Jaime Brown
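For what it's worth, a hedged sketch of a variant of option 3: repost the article text on a unique URL and add a cross-domain canonical pointing at the original publisher, which Google supports, for the articles that are still live at the source (the URL below is a placeholder, not from the post):

    <!-- in the <head> of the reposted article page on the client's site -->
    <link rel="canonical" href="http://www.news-site-example.com/original-article" />

Articles that have been removed from the press sites would still need one of the other options, since a canonical target that 404s does no good.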
Technical SEO | JamesBSEO