Best way to deal with multiple languages
-
Hey guys,
I've been trying to read up on this and have found that answers vary greatly, so I figured I'd seek your expertise.
When dealing with the URL structure of a site that is translated into multiple languages, is it better, SEO-wise, to structure the site like this: domain.com/en, domain.com/it, etc.,
or to simply add URL parameters like domain.com/?lang=en, domain.com/?lang=it?
With the first approach, I'm afraid Google might see my content as duplicate, even though it's in a different language.
-
I'd concur with this approach. However, Google Webmaster Tools only lets you geo-target, not language-target.
You might be better off implementing rel="alternate" hreflang="x" via your sitemaps to help Google understand which content is intended for which audience. See http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2620865
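For example, each `<url>` entry in the sitemap can list every language version of the page via `xhtml:link` annotations. A minimal sketch, using the example subdirectories from the question (domain.com is a placeholder):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <!-- English version: points to itself and to its Italian equivalent -->
  <url>
    <loc>http://domain.com/en/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://domain.com/en/"/>
    <xhtml:link rel="alternate" hreflang="it" href="http://domain.com/it/"/>
  </url>
  <!-- Italian version: carries the same full set of alternates -->
  <url>
    <loc>http://domain.com/it/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://domain.com/en/"/>
    <xhtml:link rel="alternate" hreflang="it" href="http://domain.com/it/"/>
  </url>
</urlset>
```

Note that the annotations must be reciprocal: every language version lists all the others (and itself), otherwise Google may ignore the hreflang hints.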
I hope this helps,
Hannah
-
Careful with this
Content in different languages shouldn't be viewed as duplicate. However, I have seen sites run into problems when they have, say, US English and UK English content that is very similar.
-
I always use the /es subdirectory approach, and you can use Google Webmaster Tools to geo-target the different subdirectories.
-
It's a fact that content in different languages is not considered duplicate content.