Best practice for multi-language site?
-
Our company is about to expand our site from English only to multiple languages: English, French, German, Japanese, and Chinese.
I understand that a solid, feasible plan is important, so I want to ask you Mozzers for help before we take action!
Our site is a business site that sells eBook software. For the product pages, the top rankings are taken by well-known software download sites like CNET, Softonic, etc., so the main source of our organic traffic is guide posts targeting long-tail keywords.
We are going to manually translate the product pages and the guide posts that target important keywords into the other languages, not the entire English site.
So my primary question is: should I use a subdomain or a subdirectory for the non-English pages, "www.example.com/fr/" or "fr.example.com"?
The second question: since we are going to manually translate entire pages into the other languages, should I use the "rel=alternate hreflang=x" tags? Google's official guidelines say that if we only translate the navigation or just part of the content, we should use this tag.
And what are your tips for building a multi-language site? Please share as many as possible.
Thanks!
-
Yes, that's a good point. If you are just translating content but not targeting it to specific countries, you can use hreflang to specify the language without specifying the country. E.g. hreflang="fr-ca" would specify French Canadian content, but hreflang="fr" would just state that it is for all French speakers.
In this case, you wouldn't need different top-level domains to target each country, which is probably more than you need!
Hope that helps.
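To make that concrete, here is a sketch of what language-only hreflang annotations could look like for the five languages in the question, assuming a subfolder structure on www.example.com (the URLs and the "/guide/" path are illustrative, not from the original thread):

```html
<!-- Placed in the <head> of every language version of the same page. -->
<!-- Each version in the cluster should list all alternates, including itself. -->
<link rel="alternate" hreflang="en" href="http://www.example.com/guide/" />
<link rel="alternate" hreflang="fr" href="http://www.example.com/fr/guide/" />
<link rel="alternate" hreflang="de" href="http://www.example.com/de/guide/" />
<link rel="alternate" hreflang="ja" href="http://www.example.com/ja/guide/" />
<link rel="alternate" hreflang="zh" href="http://www.example.com/zh/guide/" />
<!-- x-default tells Google which version to show when no language matches -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/guide/" />
```

Note that language codes follow ISO 639-1; for Chinese you may want to distinguish Simplified and Traditional script with "zh-Hans" and "zh-Hant" rather than a bare "zh".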
-
There is a difference between targeting by country and targeting by language. What I am seeing here is that you are translating only. You won't be distinguishing Canadian traffic from France traffic, right? You'll just have your content in French?
-
I'm not sure of the definitive answer to your question re. subdomains vs. subfolders, but have you considered using ccTLDs? This is still the clearest way to tell Google which country you are targeting. Obviously there are logistical points to consider with that.
See what everyone else says but there are some great articles here:
http://www.seerinteractive.com/blog/international-seo-strategy-guide
http://www.seomoz.org/blog/international-seo-where-to-host-and-how-to-target-whiteboard-friday
http://www.seomoz.org/blog/international-seo-dropping-the-information-dust
Re. hreflang, yes, I think you should implement it if you are keeping everything on the same domain, but you don't have to do it on a page-by-page basis; you can use a sitemap instead. More info and a free tool to generate them here:
http://www.themediaflow.com/2012/08/an-international-seo-implementation-tale-sitemaps-relalternate-hreflangx/
http://www.themediaflow.com/resources/tools/href-lang-tool/
Hope that helps!
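For the sitemap route mentioned above, this is a minimal sketch of the rel-alternate-hreflang sitemap format, assuming the same subfolder structure on www.example.com (URLs are illustrative). Each `<url>` entry repeats the full set of alternates, including the URL of the entry itself:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://www.example.com/guide/</loc>
    <xhtml:link rel="alternate" hreflang="en"
                href="http://www.example.com/guide/" />
    <xhtml:link rel="alternate" hreflang="fr"
                href="http://www.example.com/fr/guide/" />
  </url>
  <url>
    <loc>http://www.example.com/fr/guide/</loc>
    <xhtml:link rel="alternate" hreflang="en"
                href="http://www.example.com/guide/" />
    <xhtml:link rel="alternate" hreflang="fr"
                href="http://www.example.com/fr/guide/" />
  </url>
</urlset>
```

The advantage of the sitemap approach is that the per-page HTML stays untouched; the trade-off is that the sitemap grows quickly, since every URL must cross-reference every alternate.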