Impact of Japanese .jp site duplicate content?
-
Our main website is at http://www.traxnyc.com, and we just launched a Japanese version of the site on the http://www.traxnyc.jp domain. However, all the images used on the .jp site are hotlinked from the .com site. Would this hurt me in Google at all?
Also there is quite a bit of duplicate content on the .jp site at the moment: only a few things have been translated to Japanese and for the most part the layouts and words are exactly the same (in English). Would this hurt my Google rankings in the US at all?
Thanks for all your help.
-
You will likely face ranking issues on the .jp domain, because that site carries duplicate content and also hotlinks its images from the .com version. I don't think your US site's rankings will be hurt, though, because it is the site with the unique, original content.
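One widely used way to signal that the .jp pages are regional variants of the .com pages rather than duplicates is hreflang annotations. The snippet below is only a sketch: the /page.html paths are hypothetical placeholders, and the domain pairing is assumed from the question above.

```html
<!-- In the <head> of a page on www.traxnyc.com, pointing to its Japanese counterpart -->
<!-- (the /page.html path is a hypothetical placeholder) -->
<link rel="alternate" hreflang="en" href="http://www.traxnyc.com/page.html" />
<link rel="alternate" hreflang="ja" href="http://www.traxnyc.jp/page.html" />
```

The matching .jp page needs the same pair of tags: hreflang annotations are only honored when they are reciprocal, with each variant referencing itself and the other.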
Related Questions
-
International Sites and Duplicate Content
Hello, I am working on a project where I have some doubts regarding the structure of international, multi-language sites. The website is in the fashion industry, and I think this is a common problem for the industry: the site is translated into 5 languages and sells in 21 countries. As you can imagine this creates a huge number of URLs, so many that with ScreamingFrog I can't even complete the crawl. For example, the UK site is visible in all of these versions:
http://www.MyDomain.com/en/GB/
http://www.MyDomain.com/it/GB/
http://www.MyDomain.com/fr/GB/
http://www.MyDomain.com/de/GB/
http://www.MyDomain.com/es/GB/
Obviously, for SEO only the first version is important. As another example, the French site is available in 5 languages, and again:
http://www.MyDomain.com/fr/FR/
http://www.MyDomain.com/en/FR/
http://www.MyDomain.com/it/FR/
http://www.MyDomain.com/de/FR/
http://www.MyDomain.com/es/FR/
And so on. This is creating 3 main issues:
Endless crawling, with crawlers not focusing on the most important pages
Duplication of content
Wrong geo URLs ranking in Google
I have already implemented hreflang but haven't noticed any improvements. Therefore my question is: should I exclude the inappropriate country/language combinations with robots.txt and "noindex"? For the UK I would leave crawlable just the English version, i.e. http://www.MyDomain.com/en/GB/, for France just the French version, http://www.MyDomain.com/fr/FR/, and so on. What I would like to achieve is to have the crawlers more focused on the important SEO pages, avoid content duplication, and prevent the wrong URLs from ranking on local Google versions. Please comment
International SEO | guidoampollini
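The exclusion the question proposes could be sketched in robots.txt as below. The path patterns are assumptions taken from the question's example URLs; note also that robots.txt only blocks crawling, so a "noindex" tag on a blocked page would never be seen by Google, and the two should not be combined on the same URL.

```text
# Hypothetical sketch based on the question's example URLs:
# for the UK, keep only /en/GB/ crawlable; for France, only /fr/FR/
User-agent: *
Disallow: /it/GB/
Disallow: /fr/GB/
Disallow: /de/GB/
Disallow: /es/GB/
Disallow: /en/FR/
Disallow: /it/FR/
Disallow: /de/FR/
Disallow: /es/FR/
```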
How to best set up international XML site map?
Hi everyone, I've been searching about a problem, but haven't been able to find an answer. We would like to generate an XML sitemap for an international web shop. This shop has one domain for Dutch visitors (.nl) and another domain for visitors from other countries such as Germany, France, and Belgium (.com). The website on the 2 domains looks the same, has the same template and the same pages, but as it is targeted at other countries, the pages are in different languages and the URLs are also in different languages (see the example below for a bags category).
Example Netherlands:
Dutch domain: www.client.nl
Dutch bags category page: www.client.nl/tassen
Example France:
International domain: www.client.com
French bags category page: www.client.com/sacs
When a visitor is on the Dutch domain (.nl), which shows the Dutch content, he can switch country to, for example, France in the country switcher and then gets redirected to the other, international .com domain, and also the other way round. Now we want to generate an XML sitemap for these 2 domains. As it is the same site, but on 2 domains, development wants to make 1 sitemap, where we take the Dutch version on the Dutch domain as the basis and in the alternates specify the other language versions on the other domain, for example:
<loc>http://www.client.nl/tassen</loc>
<xhtml:link rel="alternate" hreflang="fr" href="http://www.client.com/sacs" />
Is this the best way to do this? Or would we need to make 2 sitemaps, as they are 2 domains?
International SEO | DocdataCommerce
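For reference, a complete cross-domain entry in such a sitemap could look like the sketch below. The namespace declarations and the self-referencing alternate are assumptions based on the standard sitemap protocol, reusing the question's example URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.client.nl/tassen</loc>
    <!-- self-referencing alternate plus the French page on the other domain -->
    <xhtml:link rel="alternate" hreflang="nl" href="http://www.client.nl/tassen" />
    <xhtml:link rel="alternate" hreflang="fr" href="http://www.client.com/sacs" />
  </url>
</urlset>
```

Cross-domain hreflang in a sitemap is allowed, but each listed URL must also annotate its alternates back the other way, so the .com pages would need equivalent entries.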
Homepage URL for multi-language site
Hi, we are setting up a new site and are currently considering its URL and folder structure. We will have 2-3 different language versions, and we have decided to use subfolders for this. My question is regarding the homepage URL. We want the English-language site (en) to be the default one, from where you can then change the language. Should I have a folder for each of the language versions, as described below?
www.mydomain.com/en (this would be the default page that everyone would always reach when they type www.mydomain.com into the browser)
www.mydomain.com/ru
www.mydomain.com/es
Or would it be better for SEO to have www.mydomain.com as the default URL, serving the English version of the site, and then have two other folders (as below) for the 2 other language versions?
www.mydomain.com/ru
www.mydomain.com/es
Thank you in advance, BR Sam
International SEO | Awaraman
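One mechanism relevant to this choice (an assumption on my part, not something the question mentions) is the hreflang x-default value, which tells search engines which URL to serve when no language version matches the user. A sketch, using the question's example URLs:

```html
<!-- On every language version of the homepage -->
<link rel="alternate" hreflang="x-default" href="http://www.mydomain.com/" />
<link rel="alternate" hreflang="en" href="http://www.mydomain.com/en" />
<link rel="alternate" hreflang="ru" href="http://www.mydomain.com/ru" />
<link rel="alternate" hreflang="es" href="http://www.mydomain.com/es" />
```

With this in place, the root URL can act as the default landing page regardless of whether it redirects to /en or serves the English content directly.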
Do you think the SEs would see this as duplicate content?
Hi Mozzers! I have a U.S. website and a Chinese version of that U.S. website. The China site only gets direct and PPC traffic because the robots.txt file is disallowing the SEs from crawling it. Question: If I added English sku descriptions and English content to the China site (which is also on our U.S. site), will the SEs penalize us for duplicate content even though the robots.txt file doesn’t allow them to see it? I plan on translating the descriptions and content to Chinese at a later date, but wanted to ask if the above was an issue. Thanks Mozzers!
International SEO | JCorp
Content in different languages
Hi all, I need some advice about displaying content in different languages. Currently I 301 to the correct locale based on IP, e.g. German visitors are 301ed from site.com to site.com/de, and English visitors from site.com to site.com/en. Is this the best way, or would it just be better to change the content based on the browser language and keep the URL the same? I have <link href="/hr" hreflang="hr" rel="alternate" /> tags implemented for all locales on the site. Thanks
International SEO | Sayers
How to replace my .co.uk site with my .com site in the US Google results
My customer and I are based in the UK. My customer's site, www.blindbolt.co.uk has been around for years. Last year we launched their American site, www.blindboltusa.com. Searching on google.com (tested both via proxy and using the gl=us querystring trick), a search for blind bolt on the US Google returns our www.blindbolt.co.uk site. We would like it to show our www.blindboltusa.com website in US searches. Webmaster tools has the Geographic Target set correctly for each site. Does anyone have any ideas or suggestions please? Thanks.
International SEO | OffSightIT
Site structure for multi-lingual hotel website (subfolder names)
Hi there superMozers! I've read quite a few questions about multi-lingual sites, but none answered my doubt / idea, so here it is. I'm re-designing an old website for a hotel in 4 different languages, which are all hosted on the same .com domain as follows:
example.com/english/ for English
example.com/espanol/ for Spanish
example.com/francais/ for French
example.com/portugues/ for Portuguese
While doing keyword research, I have noticed that many travel agencies separate geographical areas by folders, so an agency promoting beach hotels in South America will have a structure as follows:
travelagency.com/argentina-beach-hotels/
travelagency.com/peru-beach-hotels/
They list hotels in each folder, therefore benefiting from those keywords to rank ahead of many independent hotel sites from those areas. What I would like to do, rather than just naming those folders with the traditional /en/ for English or /fr/ for French etc., is take advantage of this extra language subfolder to include important keywords in the names of the subfolders in the following way (supposing that we have a beach hotel in Argentina):
example.com/argentina-beach-hotel/ for English
example.com/hotel-playa-argentina/ for Spanish
example.com/hotel-plage-argentine/ for French
example.com/hotel-praia-argentina/ for Portuguese
Note that the same keywords are used in the name of each folder, but translated into the language of that subfolder. To make things clear for the search engines, I would specify the language in the HTML for each page. My doubt is whether Google or other search engines may consider this 'stuffing', although most travel agencies do it in their site structure. Do any Mozers have experience with this, any idea on how search engines may react, or whether they could penalise the site? Thanks in advance!
International SEO | underground
Multi-lingual SEO: Country-specific TLD's, or migration to a huge .com site?
Dear SEOmoz team, I'm an in-house SEO looking after a number of sites in a competitive vertical. Right now we have our core example.com site translated into over thirty different languages, with each one sitting on its own country-specific TLD (so example.de, example.jp, example.es, example.co.kr etc.). Though we're using a template system so that changes to the .com domain propagate across all languages, over the years things have become more complex in quite a few areas. For example, the level of analytics script hacks and filters we have created in order to channel users through to each language profile is now bordering on the epic.
For a number of reasons we've recently been discussing the cost/benefit of migrating all of these languages into the single example.com domain. On first look this would appear to simplify things greatly; however, I'm nervous about what effect this would have on our organic SE traffic. All these separate sites have cumulatively received years of on/off-site work, and even if we went through the process of setting up page-for-page redirects to their new home on example.com, I would hate to lose all this hard work (and business) if we saw our rankings tank as a result of the move.
So I guess the question is: for an international business such as ours, which is the optimal site structure in the eyes of the search engines, local sites on local TLDs, or one mammoth site with language identifiers in the URL path (or subdomains)? Is Google still so reliant on TLD for geo-targeting search results, or is it less of a factor in today's search engine environment? Cheers!
International SEO | linklater