Best way to deal with multiple languages
-
Hey guys,
I've been trying to read up on this and have found that answers vary greatly, so I figured I'd seek your expertise.
When dealing with the URL structure of a site that is translated into multiple languages, is it better, SEO-wise, to structure the site like this: domain.com/en, domain.com/it, etc.,
or to simply add URL parameters like domain.com/?lang=en, domain.com/?lang=it?
In the first example, I'm afraid Google might see my content as duplicate even though it's in a different language.
-
I'd concur with this approach; however, you can only geo-target with Google Webmaster Tools, not language-target.
You might be better off implementing rel="alternate" hreflang="x" via your sitemaps to help Google understand which content is intended for which audience. See: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=2620865
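For illustration, a minimal sketch of that sitemap markup, using placeholder URLs and assuming the domain.com/en and domain.com/it structure from the question:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>http://domain.com/en/</loc>
    <!-- each URL lists every language version, including itself -->
    <xhtml:link rel="alternate" hreflang="en" href="http://domain.com/en/"/>
    <xhtml:link rel="alternate" hreflang="it" href="http://domain.com/it/"/>
  </url>
  <url>
    <loc>http://domain.com/it/</loc>
    <xhtml:link rel="alternate" hreflang="en" href="http://domain.com/en/"/>
    <xhtml:link rel="alternate" hreflang="it" href="http://domain.com/it/"/>
  </url>
</urlset>

You'd repeat one <url> entry per page, each carrying the full set of alternates.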
I hope this helps,
Hannah
-
Careful with this.
Content in different languages shouldn't be viewed as duplicate; however, I have seen sites run into problems when they have, say, US English and UK English content that is very similar.
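For that US/UK case, regional hreflang values can disambiguate the two versions. A minimal sketch using link tags in each page's head (the URLs are placeholders):

<link rel="alternate" hreflang="en-us" href="http://domain.com/us/page/" />
<link rel="alternate" hreflang="en-gb" href="http://domain.com/uk/page/" />

Both pages should carry the same pair of tags, so each version points to itself and to its counterpart.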
-
I always use the /es approach, and you can use Google Webmaster Tools to geo-target the different subdirectories.
-
It's a fact that different languages are not considered duplicate content.
Related Questions
-
Link Juice + multiple links pointing to the same page
Scenario
The website has a menu consisting of 4 links: Home | Shoes | About Us | Contact Us. Additionally, within the body content we write about various shoe types and create a link with the anchor text "Shoes" pointing to www.mydomain.co.uk/shoes. In this simple example, we have 2 instances of the same link pointing to the same URL.
We have 4 unique links.
In total we have 5 on-page links.
Question
How many links would Google count as part of the link juice model?
How would the link juice be weighted in terms of percentages?
Would changing the anchor text in the body content to, say, "fashion shoes" have a different impact? Any other advice or best practice would be appreciated. Thanks, Mark
Intermediate & Advanced SEO | Mark_Ch
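Purely for illustration, a back-of-the-envelope split under the even-division model from the original PageRank paper (an assumption; Google's current weighting is not public): the distributable value is divided across all 5 outgoing links, so each link carries 1/5 = 20%, and the two "Shoes" links together account for 2/5 = 40% flowing to /shoes. If instead only the first link to a given URL were counted, the page would effectively have 4 counted links at 25% each.
-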
Can we really learn from the best?
Hi All, When I started my site (an eCommerce site) I copied (or tried to copy) a lot of things from the best eCommerce sites I thought were out there, sites like Zappos, ZALES, Overstock, BlueNile, etc. I got hit pretty hard by the latest algo changes, and when I posted my question on the Google Webmaster Help forum I received answers from gurus saying that we are keyword stuffing, etc. (mainly with internal links to product pages, but other issues as well). My answer was a link to Zappos and other sites showing that what we do is nothing compared to them. I also showed dozens of SEO "errors" like using the H1 tag 10 times per page, not using canonicals, and many other issues. The guru's answer was "LOL" - who am I to compare myself to Zappos? So the question is... can we take them as an example, or are they first simply because they are the biggest?
Intermediate & Advanced SEO | BeytzNet
-
Multiple 301 Redirects on the same domain name
Hi, I'd appreciate some advice on the below. I have a website, say www.site.co.uk, that has just been redesigned using a new CMS. Previously it had URLs in the format /article.php?id=123; the new site has friendlier URLs in the format /articles/article-slug. I have been able to import the old articles into my CMS using the same article IDs, and I have created a unique slug for each post. So now in my database I have the article ID (from the query string) and a slug. However, I have hundreds of old URLs indexed by Google in the format /article.php?id=123 and need to redirect these. My plan was to do the following: 301 redirect /article.php?id=123 to an intermediate page, in this case /redirect/123. On this intermediate page I would do a database lookup for the article slug, based on the ID from the query string, create a new URL, and perform a second 301 redirect to my new URL, e.g. /articles/article-slug-from-database. Whilst this works and keeps the site usable for visitors, the two 301 redirects do worry me, as I don't want Google indexing lots of /redirect/[article id] URLs. The other solution is to generate hundreds of htaccess redirect rules that map each old URL to its new URL. The first solution is much cleaner, but the two 301s worry me. Will Google work this out on its own? Is there a better way? Any advice is much appreciated. Cheers, Rob
Intermediate & Advanced SEO | AmyCrompton
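One way to collapse the two hops into one (a sketch only - the lookup helper is a hypothetical placeholder for however the CMS queries its articles table): have the old article.php do the slug lookup itself and issue a single 301 straight to the new URL, so no /redirect/123 pages ever exist.

<?php
// article.php - one lookup, one 301, no intermediate page
$id = isset($_GET['id']) ? (int) $_GET['id'] : 0;

// hypothetical helper, e.g. SELECT slug FROM articles WHERE id = ?
$slug = $id ? lookup_slug_by_id($id) : null;

if ($slug !== null) {
    header('HTTP/1.1 301 Moved Permanently');
    header('Location: /articles/' . $slug);
} else {
    header('HTTP/1.1 404 Not Found');
}
exit;

Google would only ever see a single 301 from each old URL to its final destination, and there are no extra URLs to index.
-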
How do I use old websites to best effect?
I own a couple of old sites with DA of 15 and 17 which don't really rank for anything, as well as my main site, which has DA of 29. Can I forward these domains to my main site to increase its DA? Alternatively, is there any other way of making use of these sites?
Intermediate & Advanced SEO | benacuity
-
Best method to update navigation structure
Hey guys, We're doing a total revamp of our site and will be completely changing our navigation structure. Similar pages will exist on the new site, but the URLs will be totally changed. Most incoming links just point to our root domain, so I'm not worried about those, but the rest of the site does concern me. I am setting up 1:1 301 redirects for the new navigation structure to handle getting incoming links where they need to go, but what I'm wondering is: what is the best way to make sure the SERPs are updated quickly without trashing my domain quality, while ensuring my page and domain authority are maintained? The old links won't be anywhere on the new site. We're swapping the DNS record to the new site, so the only way for the old URLs to be hit will be incoming links from other sites. I was thinking about creating a sitemap with the old URLs listed and leaving that active for a few weeks, then swapping it out for an updated one. Currently we don't have one (kind of starting from the bottom with SEO). Also, we could use the old URLs for a few weeks on the new site to ensure they all get updated as well. It'd be a bit of work, but may be worth it. I read this article and most of that seems to be covered, but just wanted to get the opinions of those who may have done this before. It's a pretty big deal for us. http://www.seomoz.org/blog/uncrawled-301s-a-quick-fix-for-when-relaunches-go-too-well Am I getting into trouble if I do any of the above, or is this the way to go? PS: I should also add that we are not changing our domain. The site will remain on the same domain, just with a completely new navigation structure.
Intermediate & Advanced SEO | CodyWheeler
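For reference, 1:1 301s like those described can be plain mod_alias rules in .htaccess on Apache (a sketch with hypothetical paths - one line per old/new pair):

Redirect 301 /old-section/old-page /new-section/new-page
Redirect 301 /old-section/another-page /new-section/another-page

Each rule maps exactly one old URL to its replacement, so every inbound link lands on the right new page in a single hop.
-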
Website siloing... best practice?
Hi all, I am doing some research this week on the effects of siloing a Magento site. We have about 1,654 pages with approx. 1,400 products. We want to silo the website in order to address the internal linking issues and also to focus the customer journey in a more organised way. I need to report all of the possible angles and effects that this will have on the site prior to implementing it. Does anyone have info on best practice for siloing? I'd appreciate any help... Thanks, Nick
Intermediate & Advanced SEO | Total_Displays
-
What is the best way to get a new subdomain ranked properly?
Our main site (a blog with 700 high-quality articles) ranks pretty well, and we recently launched a rapidly growing forum (55,000 posts in the first 11 weeks) on a subdomain. What would be a good strategy for ranking the forum quickly?
Intermediate & Advanced SEO | xpd
-
What is the best method for segmenting HTML sitemaps?
Sitemaps create a table of contents for web crawlers and users alike, and because of how PageRank is passed, HTML sitemaps play a critical role in how Googlebot and other crawlers spider and catalog content. I get asked this question a lot, and in most cases it's easy to categorize sitemaps and create 2-3 category-based maps that can be linked to from the global footer. However, what do you do when a client has 40 categories with 200+ pages of content under each category? How do you segment your HTML sitemap in a case like this?
Intermediate & Advanced SEO | stevewiideman
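One common answer (a sketch only - category names and paths here are hypothetical) is a two-tier hub-and-spoke layout: a single sitemap hub page links to one per-category sitemap page, and each of those lists that category's 200+ pages.

<!-- /sitemap/ - the hub, linked from the global footer -->
<ul>
  <li><a href="/sitemap/widgets/">Widgets sitemap</a></li>
  <li><a href="/sitemap/gadgets/">Gadgets sitemap</a></li>
  <!-- ...one link per category, 40 in total... -->
</ul>

<!-- /sitemap/widgets/ - one spoke, listing every page in that category -->
<ul>
  <li><a href="/widgets/widget-one/">Widget One</a></li>
  <li><a href="/widgets/widget-two/">Widget Two</a></li>
  <!-- ...200+ entries... -->
</ul>

This keeps every page within two clicks of the footer link while keeping each individual sitemap page at a crawlable size.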