Multilingual site with untranslated content
-
We are developing a site that will have several languages.
There will be several thousand pages, and the default language will be English. Several sections of the site will not be translated at first, so the main content will be in English but the navigation/boilerplate will be translated.
We have hreflang alternate tags set up for each individual page, pointing to each of the other language versions.
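For example, in the English version the head carries something along these lines (the domain and paths below are placeholders, not our real URLs):

  <link rel="alternate" hreflang="en" href="https://example.com/en/some-page/" />
  <link rel="alternate" hreflang="es" href="https://example.com/es/some-page/" />
  <link rel="alternate" hreflang="fr" href="https://example.com/fr/some-page/" />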
In the Spanish version, we would point to the French version and the English version, and so on.
My question is: is this sufficient to avoid a duplicate content penalty from Google for the untranslated pages?
I am aware that from a user perspective, having untranslated content is bad, but in this case it is unavoidable at first.
-
Thanks for your comments, Gianluca.
I think Google's guidelines are somewhat ambiguous. Here it does state that "if you're providing the same content to the same users on different URLs (for instance, if both example.de/ and example.com/de/ show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately."
https://support.google.com/webmasters/answer/182192?hl=en
I think you've explained it nicely though.
-
At first, that would be fine.
That said, this is a very specific case where you can use both hreflang and the cross-domain rel="canonical".
Remember, though, that these two mark-ups are totally independent of each other.
If you use them both, as I wrote in reply to Yusuf, on one side you are telling Google that you want it to show a given URL for a given geo-targeted country/language, and on the other side you are also telling Google that that geo-targeted URL is an exact copy of the canonical one.
What Google will do is show the geo-targeted URL in the SERPs, but with the title and meta description of the canonical one.
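As a rough sketch of that double annotation (all domains and paths here are invented for illustration), the head of the geo-targeted copy would carry both mark-ups at once:

  <!-- on https://example.es/page/, an exact (still untranslated) copy of the English original -->
  <link rel="canonical" href="https://example.com/page/" />
  <link rel="alternate" hreflang="en" href="https://example.com/page/" />
  <link rel="alternate" hreflang="en-ES" href="https://example.es/page/" />
  <!-- Google would list example.es/page/ in the SERPs for Spain, but with the title
       and meta description taken from the canonical example.com/page/ -->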
One more thing, and this is a strong reason for urging a complete translation within a short period of time:
if the content of a URL on the French site, for instance, is in English, you cannot put "fr-FR" in the hreflang; you have to use "en-FR". The consequence is that the URL will tend to be shown only for English queries done on Google.fr, not for French queries... and that means losing a lot of traffic opportunities.
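A minimal sketch of that annotation (the URL is hypothetical) for a French-targeted page whose copy is still in English:

  <link rel="alternate" hreflang="en-FR" href="https://example.fr/page/" />
  <!-- not hreflang="fr-FR": the language code must describe the actual content, which is still English -->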
-
Yusuf,
I'm sorry, but I have to correct you.
If two pages are in the same language but target different countries (e.g. the USA and the UK), then even if the content is the same or substantially the same, you not only can use hreflang, you should use it, in order to tell Google that one URL must be shown to US users and the other to UK users.
Obviously, if you want, you can always decide to use the cross-domain rel="canonical" instead.
Remember, though, that in that case - if you are also using hreflang - Google will show the snippet components (title and meta description) of the canonical URL, even though it will show the geo-targeted URL itself. If instead you opt not to use hreflang, people will see the canonical URL's snippet entirely, web address included.
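To make the US/UK case concrete, a minimal hreflang-only sketch (domains invented for illustration) that both pages would carry:

  <link rel="alternate" hreflang="en-US" href="https://example.com/page/" />
  <link rel="alternate" hreflang="en-GB" href="https://example.co.uk/page/" />
  <!-- same language, substantially the same content, but each URL is meant for its own country -->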
-
Have you taken a look through the following:
https://support.google.com/webmasters/answer/182192?hl=en#1
https://sites.google.com/site/webmasterhelpforum/en/faq-internationalisation
"Duplicate content and international sites
Websites that provide content for different regions and in different languages sometimes create content that is the same or similar but available on different URLs. This is generally not a problem as long as the content is for different users in different countries. While we strongly recommend that you provide unique content for each different group of users, we understand that this may not always be possible. There is generally no need to "hide" the duplicates by disallowing crawling in a robots.txt file or by using a "noindex" robots meta tag. However, if you're providing the same content to the same users on different URLs (for instance, if both example.de/ and example.com/de/ show German language content for users in Germany), you should pick a preferred version and redirect (or use the rel=canonical link element) appropriately. In addition, you should follow the guidelines on rel-alternate-hreflang to make sure that the correct language or regional URL is served to searchers."
-
Hi Jorge
The rel="alternate" hreflang="x" tag is not suitable for pages that are in the same language, as these are essentially duplicates rather than alternative language versions.
I'd use the rel="canonical" tag to point to the main page until the translations of those pages are available.
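As a sketch of that interim set-up (the URLs are placeholders), each untranslated page would point at its English original:

  <!-- in the head of the as-yet-untranslated https://example.com/es/some-page/ -->
  <link rel="canonical" href="https://example.com/en/some-page/" />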
Webmaster Tools should allow you to see any issues.