-
I have a site that is based in the US, but each page has several versions for different regions. These versions live in folders (/en-us for the US English version, /en-gb for the UK English version, /fr-fr for the French version, etc.). Obviously, the French pages are in French. However, two versions of the site are in English with little variation in content. The pages all have an HTML lang attribute to indicate the language the page is in, but there are no hreflang tags to indicate that the pages are the same page in two different languages.
My question is: do I need to go through and add hreflang tags to each page, referencing each other, to identify to Google that these are not duplicate content issues but different language versions of the same content? Or will Google figure that out from the lang attribute?
-
Without hreflang markup, the en-US and en-GB pages will be treated as duplicate content, which you do not want. In fact, even with hreflang the two may be considered duplicates if there isn't enough differentiated content.
Also, be careful with canonicals. You shouldn't specify the en-US page as the canonical URL for the fr page. The fr page is its own page, and you should use hreflang to specify the other language versions.
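As a concrete sketch, hreflang annotations of the kind described here go in the <head> of every language version, with each page listing itself and all of its alternates (the example.com URLs and page path below are placeholders, not the actual site):

```html
<!-- On https://example.com/en-us/some-page/ — the same full set of
     alternates is repeated on the en-gb and fr-fr versions. -->
<link rel="canonical" href="https://example.com/en-us/some-page/" />
<link rel="alternate" hreflang="en-us" href="https://example.com/en-us/some-page/" />
<link rel="alternate" hreflang="en-gb" href="https://example.com/en-gb/some-page/" />
<link rel="alternate" hreflang="fr-fr" href="https://example.com/fr-fr/some-page/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/en-us/some-page/" />
```

Each version keeps a self-referencing canonical while hreflang points at the other versions; the x-default line is optional and tells Google which version to show when no language/region matches.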
-
Thanks, Martijn. The pages all have self-referencing canonical tags (except for the blog posts, where all non-US English pages reference the US English version as the canonical page).
I'm going to be safe and implement the hreflang tags. Do you think the self-referencing canonical tags on each version of the page are going to cause a problem?
-
Hi Mike,
I definitely wouldn't rely only on the HTML lang attribute, as it isn't used by many sites in the end, and it's only a vague indicator to Google of the language actually being used. I would declare the different pages with hreflang tags and, worst case, go with a canonical tag implementation.
Martijn.
Related Questions
-
WWW vs Non WWW for EXISTING site.
This one has sort of been asked already, but I cannot find an answer. When we evaluate a new SEO client, previously with Majestic we would compare the root domain vs. the www subdomain to see which had the higher Trust Flow and Citation Flow, and if there was a major difference, adjust the Google-indexed domain to the higher-performing one. Is there a way to do this with Moz? Domain Authority and subdomain authority always return the same DA for me. Thanks in advance.
Technical SEO | | practiceedge10 -
Link to AMP VS AMP Google Cache VS Standard page?
Hi guys, during link building, which version should I prefer as a destination: the normal version (PHP page), the AMP page on the website, or the AMP page in the Google cache? The main doubt is between the AMP page on the website and the standard version. Does the canonical meta make the two equivalent, or is there a better solution? Thank you so much!
Technical SEO | | Dante_Alighieri0 -
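For reference, the usual way an AMP page is tied to its standard version is a reciprocal pair of link elements, which signals to Google that the standard URL is the primary one (the URLs below are placeholders):

```html
<!-- On the standard (canonical) page -->
<link rel="amphtml" href="https://example.com/some-page/amp/" />

<!-- On the AMP page -->
<link rel="canonical" href="https://example.com/some-page/" />
```

With this pairing in place, links are generally best pointed at the standard URL; the Google-cache AMP URL is served from Google's domain, so links to it would not accrue to the site itself.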
Content change within the same URL/Page (UX vs SEO)
Context: I'm asking my client to create city pages so he can present all of his apartments in a specific sector, giving me a page that can rank for "apartment for rent in + sector". The page will present a map of all the sectors so the user can navigate and choose the sector he wants after landing on the page. Question: The UX team is asking whether we absolutely need to reload the sector page when the user clicks a location on the map, or whether they can switch the content within the same page/URL once the user is on the landing page. My concerns: 1. Could this be analyzed as duplicate content if Google can crawl within the JavaScript app, or does Google only analyze its "first view" of the page? 2. Do you consider it preferable to keep the page change, since it increases the number of pages viewed?
Technical SEO | | alexrbrg0 -
Https vs http two different domains?
If I visit mywebsite.com.au, www.mywebsite.com.au, or http://www.mywebsite.com.au, I get one website. BUT if I visit https://www.mywebsite.com.au, I get a different website, along with an untrusted-website warning. The logo in the bottom right of the HTTPS website is the name of the web designer where the website is hosted. Is this a normal practice?
Technical SEO | | GardenBeet0 -
Pros & cons: HTTP vs HTTPS
Hi there, We are planning to take the step and go from HTTP to HTTPS. The main reason to do this is to appear trustworthy to our clients, and of course the rumours that it will be better for ranking (in the future). We have a large e-commerce site, part of which is already HTTPS. I've read a lot of info about the pros and cons, including this Moz article: http://moz.com/blog/seo-tips-https-ssl But I want to hear from others who have already done this. What did you encounter when changing to HTTPS? Did you have ranking drops, loss of links, etc.? I want to make a list of pros and cons and things we have to do in advance. Thanks, Leonie
Technical SEO | | Leonie-Kramer0 -
"non-WWW" vs "WWW" in Google SERPS and Lost Back Link Connection
A Screaming Frog report indicates that Google is indexing a client's site under both www and non-www URLs. To me this means that Google is seeing the two URLs as different even though the page content is identical. The client has not set up a preferred URL in GWMT. Google says to do a 301 redirect from the non-preferred domain to the preferred version (https://support.google.com/webmasters/answer/44231?hl=en), but I believe there is a way to do this in .htaccess, which is an easier solution than canonicals. GWMT also shows that over the past few months this client has lost more than half of their backlinks (but there are no penalties, and the client swears they haven't done anything to be blacklisted in this regard). I'm curious whether Google figured out that the entire site was in its index under both "www" and "non-www" and therefore discounted half of the links. Has anyone seen evidence of Google discounting links (both external and internal) due to duplicate content? Thanks for your feedback. Rosemary
Technical SEO | | RosemaryB0
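As a sketch of the .htaccess approach mentioned above, assuming Apache with mod_rewrite enabled and www chosen as the preferred host (example.com is a placeholder):

```apache
RewriteEngine On
# 301-redirect bare-domain requests to the preferred www host
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

If the bare domain is preferred instead, swap the condition and target accordingly. Either way, this consolidates both hostnames onto one indexed version.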
Secure Vs Non-Secure Redirects
I have a client who has a lot of duplicate pages on their site: the pages exist in secure and non-secure counterparts. I'm not sure why they have this in place, but I recommended that they redirect one to the other (or vice versa) using 301 redirects. I am getting some questions as to why they should do this. Does anyone have a good document outlining the reasoning behind it? For me it's just a matter of cleaning up duplicate content, but I'm wondering if there is any technical data out there.
Technical SEO | | gkellyiii0 -
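A minimal sketch of the 301 cleanup being recommended, assuming Apache with mod_rewrite and HTTPS as the preferred version:

```apache
RewriteEngine On
# Send any plain-HTTP request to the same URL over HTTPS with a 301
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
```

The permanent (301) redirect is what tells search engines to consolidate the duplicate secure/non-secure URLs onto a single version.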
Using Thesis as blog platform vs. Tumblr
I read about a lot of advantages to using Thesis as a blogging platform, but I like the themes and other plugins from Tumblr. Are there equivalents on Tumblr to the Thesis benefits, so I can go ahead with Tumblr?
Technical SEO | | HyperOffice0