Best SEO practices for multiple languages on a website
-
Hi,
We would like to offer multiple languages on our global website. What's the best practice to benefit both UI and SEO? Can the website choose a language automatically based on the visitor's location? Or should we have dedicated pages for important languages, like www.website.com/de for German? If we go for the latter, what happens when users browse beyond the language landing page, since the rest of the site will usually be in English?
-
Hi,
Thanks for the reply.
We are more interested in folders than separate TLDs. In that case, how do we serve the rest of the site's pages to visitors in other languages?
For French, they will land on example.com/fr/, and if they browse to other pages, must all those pages also be in French? If so, what's the best way to present French content to them: just auto-translation, or content actually written in French? Any good example site you can refer to?
-
There is more than one possible answer to this; I believe it depends on your business needs.
You should let Google know about all the versions of the site via hreflang annotations: lines of code you insert within the <head> of each page. Let me give you an example:
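Something along these lines (the URLs here are placeholders; each line points to one language version of the same page, self-reference included):

```html
<!-- One link element per language version, in the <head> of every version -->
<link rel="alternate" hreflang="es" href="https://example.com/es/" />
<link rel="alternate" hreflang="fr" href="https://example.com/fr/" />
<link rel="alternate" hreflang="pt" href="https://example.com/pt/" />
<link rel="alternate" hreflang="en" href="https://example.com/en/" />
```

You can also add an hreflang="x-default" line pointing to the version Google should show searchers who don't match any listed language.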
Those lines tell Google that a specific page has equivalents in Spanish, French, Portuguese, and English.
All the versions should be included, even the one the visitor is currently reading.
That's the basic step. Then you can choose between a folder per language (example.com/de) or a country-specific TLD (example.de). A country-specific domain is great for targeting specific countries, while the folder structure is more suitable for targeting languages that span many countries (Spanish may target only Argentina if you use example.com.ar, or all Spanish-speaking countries if you choose example.com/es).
I'm sure the community will help you dig deeper on this, let me add some links to get more info on this.
Related Questions
-
SEO QA automation of large websites
Can you share your experiences in managing SEO QA automation of large websites with millions of pages?
Intermediate & Advanced SEO | terentyev
What are the things you regularly test for, besides the most obvious (hreflang/canonicals, robots.txt, sitemap, non-200 status codes, redirect rules)?
Do you use in-house developed tools or external tools?
If external, which ones?
How do you run your QA automation scripts: on an external server or with some online tool? Upon every release, or hourly/daily/monthly?
-
Does anyone know how dynamic/personalized website content affects SEO?
A client using Marketo has inquired about personalizing their website content based on a persona. To be clear, I'm talking about key website pages, maybe even the home page, not PPC/campaign-specific landing pages. For example, areas of the site would display content differently to a CEO vs. a salesperson. I'm new to marketing automation and don't know exactly how this piece works. Hoping someone here has experience or can provide pros/cons guidance. How would search engines work with this type of page? Here's Marketo's page explaining what it does: https://docs.marketo.com/display/public/DOCS/Web+Personalization+-+RTP
Intermediate & Advanced SEO | Flock.Media
-
What's the best way to A/B test new version of your website having different URL structure?
Hi Mozzers, hope you're doing well. We have a website that has been up and running for a decent tenure, with millions of pages indexed in search engines. We're planning to go live with a new version of it: a new experience for our users and some changes in site architecture, including a change in URL structure for existing URLs and the introduction of some new URLs as well.
Now, my question is: what's the best way to A/B test the new version? We can't launch it for only part of our users (say, make it live for 50% of users while the remaining 50% see only the old/existing site) because the URL structure has changed and bots will get confused if they start landing on different versions.
Will this work if I reduce the crawl rate to zero during the A/B period? How will this impact us from an SEO perspective? How will the old-to-new 301 redirects affect our users? Have you ever faced/handled this kind of scenario? If yes, please share how you handled it, along with the impact. If this is something new to you, I would love to hear your recommendations before making the final call.
Note: We're taking care of all existing URLs, properly 301 redirecting them to their newer versions, but there are some new URLs that are supported only on the newer version (the architectural changes I mentioned above); these URLs aren't backward compatible and can't be redirected to a valid URL on the old version.
Intermediate & Advanced SEO | _nitman
-
What are the best practices for microdata?
Not too long ago, Dublin Core was all the rage. Then Open Graph data exploded, and now Schema.org seems to be highly regarded. In a best-case scenario, on a site that's already got the basics (good content, clean URLs, rich and useful page titles and meta descriptions, well-named and alt-tagged images, and proper document outlines), what are today's best practices for microdata? Should Open Graph information be added? Should the old Dublin Core be resurrected? I'm trying to find a way to keep markup light and minimal, but include enough structured data for crawlers to get a better sense of the content and its relationships to other subdomains and sites.
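For what it's worth, a minimal Schema.org sketch in JSON-LD (all values here are hypothetical) is one way to keep markup light, since it lives in a single script tag instead of attributes scattered through the HTML:

```html
<!-- Hypothetical article markup; swap in your own values -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example headline",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2017-01-15",
  "image": "https://example.com/images/example.jpg"
}
</script>
```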
Intermediate & Advanced SEO | WebElaine
-
An affiliate website uses datafeeds, and around 65,000 products are deleted in the new feeds. What's the best practice for those product pages: 404 all of them, or 301 redirect to the parent category?
Note: all product pages are set to INDEX, FOLLOW. Right now this is what happens with the deleted product pages:
1. When a product is removed from the new datafeed, the page stays online and shows similar products for 3 months. The product pages are removed from the category pages but not from the sitemap!
2. Pages receiving more than 3 hits after the first 3 months keep existing, and stay in the sitemaps. These pages are not shown in the categories.
3. Pages from deleted datafeeds that receive 2 hits or less get a 301 redirect to the parent category for another 3 months.
4. After the last 3 months, all 301 redirects get a customized 404 page with similar products.
Any suggestions or comments about this structure? 🙂 Issues to think about:
Intermediate & Advanced SEO | Zanox
- The amount of 404 pages Google is warning about in GWT
- Right now all product pages are indexed
- Extracting as much value as possible, in the right way, from all pages
- Usability for the visitor
Extra info about the near future: because of the duplicate content issue with datafeeds, we are going to set all product pages to NOINDEX, FOLLOW and focus only on category and subcategory pages.
-
Best practice for listings with outbound links
My site contains a number of listings for charities that offer various sporting activities people can get involved in to raise money. As part of each listing we provide an outbound link for the user to find out more about the charity and its activities. Currently these listings are blocked in robots.txt for fear that we might be viewed as a link farm or spam site (as there are hundreds of charities listed on the scrolling page), but these outbound links are genuine, provide benefits, and are a useful resource for the user, not paid links. What I'd like to do is make these listings fully crawlable and indexable to increase our search traffic to them, but I'm not sure whether this would have a negative impact on our PageRank, with Google potentially viewing all these outbound links as 'bad' or 'paid' links. Would removing the listing pages from our robots.txt and making all the outbound links 'nofollow' be the way forward, allowing us to properly index the listings without being penalised as some kind of link farm or spam site? (N.B. I have no interest in passing link juice to the external charity websites.)
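For reference, a nofollow'd listing link would look something like this (the charity URL and anchor text are placeholders):

```html
<!-- rel="nofollow" asks search engines not to pass link equity to the charity's site -->
<a href="https://example-charity.org/" rel="nofollow">Find out more about this charity</a>
```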
Intermediate & Advanced SEO | simon_realbuzz
-
Website Restructure - Good or Bad for SEO?
Due to the fact that we aren't in the #1 position (we dropped from #5 to page 2; you have to love devs and IT), our heads have hired an SEO audit/consulting company to review everything we are doing. I would like to post some of the things they are telling us to do that I don't 100% agree with, and would like some other professional feedback, especially since their own site isn't marketed very well. http://www.trupanionpetinsurance.com
Disclaimer: this site was a complete nightmare when I started a year and a half ago. Yes, there are many issues that still need to be addressed.
Website restructure: I agree we totally need to restructure our website; I have no idea what the previous SEO guy was thinking. The new SEO company is telling us that the structure is a big part of SEO. I don't believe so, but besides a little loss of 301 juice, are there any other downsides? Are there any real benefits? A similar question was asked the other day (and answered by me): http://www.seomoz.org/q/don-t-want-to-lose-page-rank-what-s-the-best-way-to-restructure-a-url-other-than-a-301-redirect
Intermediate & Advanced SEO | Trupanion
-
What are best SEO practices for product pages of unique items when the item is no longer available?
Hello, my company sells used cars through a website. Each vehicle page contains photos and details of the unit, but once the vehicle is sold, all the content is replaced by a simple text like "this vehicle is no longer available".
Intermediate & Advanced SEO | Darioz
The title of the page also changes to a generic one. The URL remains the same. I doubt this is the correct way of doing it, but I cannot work out what method would be better. The improvement I am considering for pages of no-longer-available vehicles is this: keep the page alive but with reduced vehicle details, a text like "this vehicle is no longer available", and automatic recommendations for similar items. What do you think? Is this a good practice, or do you suggest anything different? Also, should I put a NOINDEX tag on the expired vehicle pages? Thank you in advance for your help.
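If you do decide to noindex the expired pages, the usual mechanism is a robots meta tag in the head of each sold-vehicle page; a minimal sketch:

```html
<!-- Keeps the sold-vehicle page out of the index while still letting crawlers
     follow its links (e.g. to the similar-item recommendations) -->
<meta name="robots" content="noindex, follow">
```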
URL remains the same. I doubt this is the correct way of doing, but I cannot understand what method would be better. The improvement I am considering for pages of no longer available vehicles is this: keep the page alive but with reduced vehicle details, a text like: this vehicles is not available anymore and automatic recommendations for similar items. What do you think? Is this a good practice or do you suggest anything different? Also, should I put a NOINDEX tag on the expired vehicles pages? Thank you in advance for your help.0