Correct site internationalization strategy
-
Hi,
I'm working on the internationalization of a large website; the company wants to reach around 100 countries. I read this Google doc: https://support.google.com/webmasters/answer/182192?hl=en in order to design the strategy.
The strategy is the following:
For each market, I'll define a domain or subdomain with the following settings:
- Keep mysitename.com for the biggest market, where it has been working for years, and define its geographic target in Google Search Console.
- Reserve the ccTLD domains for the other markets.
- In markets where I'm not able to reserve the ccTLD, I'll use subdomains of the .com site, for example us.mysitename.com, and define the geographic target for that subdomain in Google Search Console.
Each domain will only be in the preferred language of each country (but the user will be able to change the language via cookies).
The content will be similar in all markets that share a language; for example, the .co.uk and .us texts will be the same, but the product selection will be specific to each market.
Each URL will link to its equivalent in the other countries, both via a direct link and via hreflang. The point of this is that any link equity one version earns will be passed on to all the other sites.
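To make the cross-linking concrete, here is a minimal sketch of the reciprocal hreflang annotations described above; the domains and locales are illustrative examples in the spirit of the plan, not the real site's:

```python
# Hypothetical sketch of reciprocal hreflang annotations; domains and
# locales are illustrative, not the real site's.
ALTERNATES = {
    "en-gb": "https://mysitename.co.uk/page-a",   # ccTLD market
    "en-us": "https://us.mysitename.com/page-a",  # subdomain market
    "de-de": "https://mysitename.de/page-a",      # another ccTLD market
}

def hreflang_tags(alternates, x_default=None):
    """Build the <link> tags that every version of the page must carry.
    Each version needs the SAME full set, including a self-reference,
    or search engines may ignore the whole group."""
    tags = [
        f'<link rel="alternate" hreflang="{lang}" href="{url}" />'
        for lang, url in sorted(alternates.items())
    ]
    if x_default:
        tags.append(
            f'<link rel="alternate" hreflang="x-default" href="{x_default}" />'
        )
    return tags

for tag in hreflang_tags(ALTERNATES):
    print(tag)
```

The same set of tags (self-reference included) would go in the `<head>` of every country version of the page.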
My questions are:
- Do you think that there are any possible problems with this strategy?
- Is it possible that I'll have problems with duplicate content? (like I said before, all domains will be assigned to a specific geographic target)
- Each site will have around 2,000,000 URLs. Do you think this could generate problems? It's likely that only the primary and a few other important locations will have URLs with high-quality external links and decent TrustRank.
- Any other considerations or related experiences with a similar process would be very much appreciated as well.
Sorry for all the questions, but I want to be really sure about this plan, since the company's growth depends on this internationalization process.
Thanks in advance!
-
Thanks so much Gianluca, I'll take all your ideas into account.
-
You wrote this, and I'd like to address it:
Each domain will only be in the preferred language of each country (but the user will be able to change the language via cookies).
Why should people (Italians, for instance) even feel the need to switch the language from Italian to English?
Honestly, I find it unnecessary.
What you should do is what Amazon does: let people visit whatever version they want. For instance (I live in Spain), when I am in the UK and I want to buy something on Amazon, I visit amazon.es. Even though Amazon knows I'm in the UK, and advises me that I might prefer to shop on the .co.uk site, it lets me stay, browse, and buy from the .es one.
Then you say this:
Each URL will link to the same link in other countries via direct link and also via hreflang. The point of this is that all the link relevance that any of them gets, will be transmitted to all other sites.
This is not quite true; at least, not literally. The PageRank any of your pages earns via internal and external links will only partly be passed on to the corresponding pages of the other country versions. That's because PageRank flows through every link on a page, internal and external alike, and "evaporates" through nofollow links.
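A toy illustration of the dilution described here, assuming the simplified PageRank model in which a page's dampened score is split evenly across its links (the numbers are made up for the example):

```python
def passed_per_link(page_rank, followed_links, nofollow_links, damping=0.85):
    """Toy PageRank model: the dampened score of a page is split evenly
    across ALL of its links; the share pointed at nofollow links simply
    'evaporates' rather than flowing anywhere."""
    total_links = followed_links + nofollow_links
    if total_links == 0:
        return 0.0
    return damping * page_rank / total_links

# A page with score 1.0 and 50 total links (5 of them nofollow):
# each cross-country link receives only a small slice of the equity,
# and the shares pointed at the 5 nofollow links are lost entirely.
share = passed_per_link(1.0, followed_links=45, nofollow_links=5)
print(round(share, 4))  # 0.017 per followed link
```

So a direct link to each country version does pass something, but only a fraction of the page's score, not "all the link relevance".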
About your questions:
- Do you think that there are any possible problems with this strategy?
Overall it is correct (my only doubt is the "cookie" approach you described).
Is it possible that I'll have problems with duplicate content? (like I said before, all domains will be assigned to a specific geographic target)
If you use hreflang, you should not have issues related to duplicate content.
Each site will have around 2,000,000 URLs. Do you think this could generate problems? It's likely that only the primary and a few other important locations will have URLs with high-quality external links and decent TrustRank.
Having millions of URLs should not be a problem in itself; if it were, sites like Etsy, Home Depot, or Amazon would be suffering from it, wouldn't they? When it comes to big sites, the most important thing is having a very solid architecture and getting every aspect of the internal linking right.
Any other considerations or related experiences with a similar process would be very much appreciated as well.
When implementing the hreflang annotations, don't add an annotation for every country version you have.
In other words, apart from the home page (where full annotations make sense for localized brand visibility, and to avoid, for instance, the more authoritative .com version outranking the local one), on internal pages use hreflang only to suggest to Google which version to show for countries that share the same language.
For instance, say www.dominio.com/page-a is in English and targets the USA; its hreflang annotations should reference only the corresponding English-language URLs targeting other English-speaking countries. You should not add annotations for the Spanish or Italian versions.
Why? Because the languages are different, and language is such a strong signal on its own that you don't need to tell Google to show Spanish-speaking users in Spain the Spanish version's URL instead of the American English one.
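The rule of thumb above can be sketched as a simple filter: on internal pages, keep only the hreflang alternates that share the current page's language. Only www.dominio.com/page-a comes from the example; the other domains and locales are hypothetical:

```python
# Keep only same-language hreflang alternates for internal pages.
# www.dominio.com/page-a is from the example above; the rest are
# hypothetical illustrations.
ALL_VERSIONS = {
    "en-us": "https://www.dominio.com/page-a",
    "en-gb": "https://www.dominio.co.uk/page-a",
    "en-au": "https://www.dominio.com.au/page-a",
    "es-es": "https://www.dominio.es/pagina-a",
    "it-it": "https://www.dominio.it/pagina-a",
}

def same_language_alternates(versions, current_locale):
    """Keep versions whose language code (the part before the hyphen)
    matches the current page's language; the self-reference stays in."""
    lang = current_locale.split("-")[0]
    return {loc: url for loc, url in versions.items()
            if loc.split("-")[0] == lang}

print(sorted(same_language_alternates(ALL_VERSIONS, "en-us")))
# ['en-au', 'en-gb', 'en-us']  (the Spanish and Italian versions drop out)
```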
-
Thanks Dmitrii.
Any other opinions will be appreciated as well; this process is really important for this website.
-
Hi there.
Everything seems good to me. Just make sure that you use proper hreflang or canonical tags for content that could potentially be duplicated, that you have a proper/correct sitemap, and that there are no problems with crawlability or accessibility.
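On the sitemap point: at this scale, one option worth knowing is that hreflang annotations can also be carried in the XML sitemap via `xhtml:link` entries instead of per-page tags, which keeps the annotation sets in one maintainable place. A sketch under hypothetical URLs, not the poster's actual setup:

```python
# Sketch: build one sitemap <url> entry carrying its hreflang alternates
# via xhtml:link (URLs are hypothetical examples).
ALTERNATES = {
    "en-gb": "https://mysitename.co.uk/page-a",
    "en-us": "https://us.mysitename.com/page-a",
}

def sitemap_url_entry(loc, alternates):
    """Note: the sitemap's <urlset> root must declare
    xmlns:xhtml="http://www.w3.org/1999/xhtml" for these children."""
    lines = ["  <url>", f"    <loc>{loc}</loc>"]
    for lang, href in sorted(alternates.items()):
        lines.append(
            f'    <xhtml:link rel="alternate" hreflang="{lang}" href="{href}"/>'
        )
    lines.append("  </url>")
    return "\n".join(lines)

print(sitemap_url_entry("https://us.mysitename.com/page-a", ALTERNATES))
```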
Good luck