Multilingual site - separate domain or all under the same umbrella
-
This has been asked before with no clear winner. I am trying to sum up the pros and cons of building a multilingual site under one shared domain versus breaking it into dedicated domains per language.
As an example, let's assume we are talking about a French property portal with an English version as well. Assume most of the current incoming links and traffic are from France.
A) www.french-name.fr/fr/pageX for the French version
www.english-name.com/en/pageX for the English version
B) www.french-name.fr/fr/ for the French version (as is)
www.french-name.fr/en for the English version
The client currently follows approach A but is considering moving to B. We see the following pros and cons for B:
Pros:
- takes advantage of the french-name.fr domain's strength and incoming links
- scalable: more languages can be added without registering a new domain and building search engine rankings for each one individually
Cons:
- potential duplicate content issues, as we are not able to geotarget each language differently in Google Webmaster Tools
- potential dilution of each page's strength, as there will now be many more pages under the same domain (roughly double) - is this a valid concern?
- usability/marketing concerns, as the name of the site is not in English (though people looking for a house in France would at least not find it completely alien)
What are your thoughts on this?
Thanks in advance
-
Google has written a lot about this subject, and Matt Cutts has debated it in one of his videos. Here is a link:
http://googlewebmastercentral.blogspot.com/2010/03/working-with-multi-regional-websites.html
-
Thanks Ryan.
That's a good point about the .fr domain name.
In this example the content is about France, but the audience would still be international.
-
All things being equal, the approach you labeled as "B" is recommended.
The biggest concern I have is choosing the ".fr" domain to accommodate non-French countries. Google restricts the flexibility of geotargeting for country-specific domains. If you can use your .com domain to implement your plan, that would be best. I realize this is a French company, but the .com domain would be strongly preferred for any form of international business.
The doubling of pages is not a concern. It is fine to duplicate the site within a country folder as long as the content is properly modified to target the specific country. This involves more than just a pure language translation: units of measurement and cultural differences need to be considered as well.
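As an aside (this is an addition, not part of the original answer): the standard way to tell Google that the language folders are alternates of each other, rather than duplicates, is hreflang annotations. Below is a minimal Python sketch that generates the alternate-link tags each language version of a page would carry; the domain and page name are the hypothetical ones from the question.

```python
# Sketch: generate hreflang <link> tags for a page that exists in several
# language folders under one domain (option "B" above). The domain and
# page name are the hypothetical examples from the question; https is
# assumed.

LANG_FOLDERS = {"fr": "fr", "en": "en"}  # language code -> URL folder


def hreflang_tags(domain: str, page: str) -> list[str]:
    """Return the alternate-link tags every language version should include."""
    tags = []
    for lang, folder in LANG_FOLDERS.items():
        url = f"https://{domain}/{folder}/{page}"
        tags.append(f'<link rel="alternate" hreflang="{lang}" href="{url}" />')
    return tags


for tag in hreflang_tags("www.french-name.fr", "pageX"):
    print(tag)
```

Each language version of the page would carry the full set of tags, so the search engine can serve the right URL to the right audience instead of treating the versions as duplicate content.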
Related Questions
-
Splitting a site into 2 international sites
Hi all, I have a client that currently has a .com domain that ranks in both the US and the UK for various search terms. They have identified a need to provide different information for UK and US visitors which will require 2 versions of all pages. If we set up a .co.uk domain and keep the .com obviously that will be a brand new UK site which will have zero rankings. Any suggestions as to the best way to introduce this second version of the content without losing UK rankings? Thanks
-
E-commerce : 1 site per country or 1 site per language?
I'm working with a European e-commerce company; they already have a French website on a .fr domain. They want to target Belgium with a .be domain and the Netherlands with a .nl domain. Belgium = 50% Dutch, 50% French. Is it better to build three websites, one per country, or two websites, one per language? Thinking about SEO, costs, and VAT management, what is your opinion?
-
Redirect the main site to keyword-rich subfolder / specific page for SEO
Hi, I have two questions.
Question 1: is it worthwhile to redirect the main site to a keyword-rich subfolder / specific page for SEO? For example, my company's webpage is www.example.com. Would it make sense to redirect (301) the main site to the address www.example.com/service-one-in-certain-city? I am asking this as I have learned that it is important for SEO to have keywords in the URL, and I was thinking that we could do this and include the most important keywords in the subfolder / specific URL. What are the pros and cons of this? Should I create folders or pages just for the sake of keywords?
Question 2: Most companies have their main URL shown as www.example.com when you access their domain. However, some multi-language sites show e.g. www.example.com/en or www.example.com/en/main when you type the domain into your web browser. I understand that it is common practice to use subdomains or folders to separate different language versions. My question is regarding subfolders. Is it better to have only the subfolder shown (www.example.com/en), or should I also include the specific page's URL after the subfolder with keywords (www.example.com/en/main or www.example.com/en/service-one-in-certain-city)? I don't really understand why some companies show only the subfolder of a specific language page and some the page's URL after the subfolder. Thanks in advance, Sam
-
URL Structure for Multilingual Site With Two Major Locations
We're working on a hotel site that has two major locations. The locations currently live on separate domains. The sites target users from around the world and offer content in multiple languages. The client is looking into migrating all content into one domain and creating subfolders for each location. The sites are strong in organic search, but they want to expand the keyword portfolio to broader keywords regarding activities, which they also market on their sites. The goal is to scale their domain authority, as they have a really strong brand. The question is which URL structure would be preferred if the content is finally migrated into one domain (we have doubts about where the lang folder should be placed, as each location has different amenities and services). Here is what we had in mind:
domain.com – the homepage
domain.com/location-1 – to target English visitors
domain.com/location-2 – to target English visitors
domain.com/es/location-1 – to target Spanish visitors
domain.com/es/location-2 – to target Spanish visitors
-
Multiple domains for one site / satellite domains
Hi, I know this has been asked a few times before but I want to clarify everything my own head. We've recently relaunched a website for a client that combined three existing sites into one. The new site is http://www.gowerpensions.com/ I've added 301 rewrite rules to the three old domains to to point to the correct page on the new website, i.e the old contact page goes to the new one, the about page to the new about page etc, etc. The old domains are thehorizonplan.com, horizonqrops.com and horizonqnups.com. I've informed Google Webmaster Tools of the change. The client also has several other domains such as horizonpensions.com and qnupscheme.com. Am I correct in thinking I should not park these domains on top of the gowerpensions.com website as this will be seen as duplicate content? I don't think there is anything linking to these domains. They might not even be listed in Google. With the thehorizonplan.com, horizonqrops.com and horizonqnups.com domains there are existing links to them, but will parking these on top of gowerpensions.com cause a problem, or should I keep my 301 redirects forever? Would a better strategy be to make microsites on all of the satellite domains that link to the main one to create more relevant links? If this is the case then I'd need to fix any third party links to the old horizon domains. I hope that makes sense. Thanks Ric
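For what it's worth, the page-to-page mapping described above can be kept as a simple lookup table that a 301 handler consults for as long as the redirects stay live. Here is a minimal Python sketch using the domains and pages mentioned in the question; the fallback-to-homepage behaviour is an assumption for illustration, not something the poster specified.

```python
# Sketch: map paths on the retired domains to their counterparts on the
# new site, as a 301 redirect handler might. Domains and pages are from
# the question above; falling back to the new homepage for unmapped old
# URLs is an assumption.

NEW_SITE = "http://www.gowerpensions.com"

REDIRECT_MAP = {
    ("thehorizonplan.com", "/contact"): f"{NEW_SITE}/contact",
    ("thehorizonplan.com", "/about"): f"{NEW_SITE}/about",
}


def redirect_target(host: str, path: str) -> str:
    """Return the URL a request to an old domain should be 301'd to."""
    # Unmapped old URLs fall back to the new homepage (still a 301).
    return REDIRECT_MAP.get((host, path), NEW_SITE)
```

Because a lookup like this is cheap to keep in place, there is little reason ever to retire the 301s while the old domains still have inbound links.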
International SEO | | BWIRic0 -
Site Spider/ Crawler/ Scraper Software
Short of coding up your own web crawler - does anyone know of / have any experience with a good piece of software to run through all the pages on a single domain? (And potentially on linked domains one hop away...) This could be either server or desktop based. Useful capabilities would include:
- Scraping (XPath parameters)
- Clicks from homepage (site architecture)
- HTTP headers
- Multi-threading
- Use of proxies
- Robots.txt compliance option
- CSV output
- Anything else you can think of...
Perhaps an opportunity for an additional SEOmoz tool here, since they do it already! Cheers!
Note: I've had a look at Nutch (http://nutch.apache.org/), Heritrix (https://webarchive.jira.com/wiki/display/Heritrix/Heritrix), Scrapy (http://doc.scrapy.org/en/latest/intro/overview.html) and Mozenda (does scraping but doesn't appear extensible...). Any experience / preferences with these or others?
-
What is the best SEO site structure for multi country targeting?
Hi There, We are an online retailer with four (and soon to be five) distinct geographic target markets (we have physical operations in both the UK and New Zealand). We currently target these markets like this: United Kingdom (www.natureshop.co.uk) New Zealand (www.natureshop.co.nz) Australia (www.natureshop.com/au) - using a google web master tools geo targeted folder United States (www.natureshop.com) - using google web master tools geo targeted domain Germany (www.natureshop.de) - in german and yet to be launched as full site We have various issues we want to address. The key one is this: our www.natureshop.co.uk website was adversely affected by the panda update on April 12. We had some external seo firms work on this site for us and unfortunately the links they gained for us were very low quality, from sometimes spammy sites and also "keyword" packed with very littlle anchor text variation. Our other websites (the .co.nz and .com) moved up after the updates so I can only assume our external seo consultants were responsible for this. I have since managed to get them to remove around 70% of these links and we have bought all seo efforts back in house again. I have also worked to improve the quality of our content on this site and I have 404'ed the six worst affected pages (the ones that had far too many single phrase anchor text links coming into them). We have however not budged much in our rankings (we have made some small gains but not a lot). Our other weakness's are not the fastest page load times and some "thin" content. We are on the cusp (around 4 weeks away) of deploying a brand new platform using asp.net MVP with N2 and this looks like it will address our page load speed issues. We also have been working hard on our content building and I believe we will address that as well with this release. Sorry for the long build up, however I felt some background was needed to get to my questions. 
My questions are:
1. Do you think we are best to proceed with trying to get our www.natureshop.co.uk website out of the Panda trap, or should we consider deploying a new version of the site on www.natureshop.com/uk/ (geotargeted to the UK)?
2. If we are to do this, should we do the same for New Zealand and Germany and redirect the existing domains to the new geotargeted folders?
3. If we do this, should we redirect the natureshop.co.uk pages to the new www.natureshop.com/uk/ pages, or will this simply pass on the Panda "penalty"?
4. Will this model build stronger authority on the .com domain that benefits all of the geotargeted subfolders, or does it not work this way?
5. Finally, can we deploy the same pages and content on the different geotargeted subfolders (with some subtle regional variations of spelling and language), or will this result in a duplicate content penalty?
Thank you very much in advance to all of you, and I apologise for the length and complexity of the question. Kind regards,
Conrad Cranfield
Founder: Nature Shop Ltd
-
I have a site that has 65 different versions of itself.
I've just started managing a site that serves over 50 different countries and the entire web enterprise is being flagged for duplicate content because there is so much of it. What's the best approach to stop this duplicate content, yet serve all of the countries we need to?