Hreflang tags vs. HTML lang attribute
-
I have a site that is based in the US, but each page has several different versions for different regions. These versions live in folders (/en-us for the US English version, /en-gb for the UK English version, /fr-fr for the French version, etc.). Obviously, the French pages are in French. However, there are two versions of the site that are in English, with little variation in the content. The pages all have a lang attribute to indicate the language each page is in, but there are no hreflang tags to indicate that these are the same page in two different languages.
My question is: do I need to go through and add hreflang tags to each page so they reference each other and identify to Google that these are not duplicate content issues, but different language versions of the same content? Or will Google figure that out from the lang attribute?
-
Without hreflang markup, the en-US and en-GB pages will be treated as duplicate content, and you do not want that. In fact, even with hreflang, the two may be considered duplicates if there isn't enough differentiated content.
Also, be careful with canonicals. You shouldn't specify the en-US page as the canonical URL for the fr-FR page; the fr-FR page is its own page, and you should use hreflang to point to the other language versions.
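To illustrate the combination described here: each version of a page keeps a self-referencing canonical and lists every language alternate, along these lines (a sketch; URLs are placeholders based on the folder structure in the question):

    <!-- In the <head> of the en-US version -->
    <link rel="canonical" href="https://www.example.com/en-us/page/" />
    <link rel="alternate" hreflang="en-us" href="https://www.example.com/en-us/page/" />
    <link rel="alternate" hreflang="en-gb" href="https://www.example.com/en-gb/page/" />
    <link rel="alternate" hreflang="fr-fr" href="https://www.example.com/fr-fr/page/" />

The same set of alternate links must appear on the en-GB and fr-FR versions as well, each with its own self-referencing canonical.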
-
Thanks, Martijn. The pages all have self-referencing canonical tags (except for the blog posts, where all non-US English pages reference the US English version as the canonical page).
I'm going to play it safe and implement the hreflang tags. Do you think the self-referencing canonical tags on each version of the page are going to cause a problem?
-
Hi Mike,
I definitely wouldn't rely only on the HTML lang tag, as it isn't used much by sites in the end, and it's only a vague indicator to Google of the actual language being used there. I would declare the different pages with hreflang tags and, worst case, go with a canonical tag implementation.
Martijn.
Related Questions
-
SEO: open source e-commerce vs. off the shelf
I'm trying to decide on a web development company. Both would build our CMS with WordPress, but for the e-commerce platform one would use Shopify (off the shelf) and the other would use WooCommerce (open source). SEO-wise, is there any benefit to going one way or the other? I am worried the off-the-shelf option (Shopify) would be weaker because it would be hosted on a different server than the CMS, plus I think the URL structure would be less flexible through Shopify (keywords would be further down the URL structure). Thanks, Dan
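To make the URL-structure worry concrete: Shopify's path prefixes are fixed, while WooCommerce permalinks are configurable. The URLs below are illustrative only:

    # Shopify (prefixes cannot be removed)
    https://www.example.com/products/blue-widget
    https://www.example.com/collections/widgets

    # WooCommerce (permalink base is configurable, e.g.)
    https://www.example.com/shop/blue-widget/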
Technical SEO | dcostigan -
Easy question: noindex meta tag vs robots.txt
This seems like a dumb question, but I'm not sure what the answer is. I have an ecommerce client who has a couple of subdirectories, "gallery" and "blog". Neither directory gets a lot of traffic or produces many conversions, so I want to remove the pages so they don't drain PageRank from more important pages. Does this sound like a good idea? I was thinking of either disallowing the folders via the robots.txt file, adding a "noindex" tag, 301 redirecting, or deleting them. Can you help me determine which is best?
**DEINDEX:** As I understand it, the noindex meta tag is going to allow the robots to still crawl the pages, but they won't be indexed. The supposed good news is that it still allows link juice to be passed through. This seems like a bad thing to me, because I don't want to waste my link juice passing to these pages; the idea is to keep my PageRank from being diluted on these pages. A similar question: if PageRank is finite, does Google still treat these pages as part of the site even if it's not indexing them? If I do deindex these pages, I think there are quite a few internal links to them. Even though these pages are deindexed, they still exist, so it's not as if the site would return a 404, right?
**ROBOTS.TXT:** As I understand it, this will keep the robots from crawling the pages, so they won't be indexed and the link juice won't pass. I don't want to waste the PageRank that links to these pages, so is this a bad option?
**301 REDIRECT:** What if I just 301 redirect all these pages back to the homepage? Is this an easy answer? Part of the problem with this solution is that I'm not sure it's permanent, but even more important, currently 80% of the site is made up of blog and gallery pages, and I think it would be strange to have the vast majority of the site 301 redirecting to the home page. What do you think?
**DELETE PAGES:** Maybe I could just delete all the pages. This would keep the pages from taking link juice and would deindex them, but I think there are quite a few internal links to these pages. How would you find all the internal links that point to these pages? There are hundreds of them.
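For reference, here is what the two main options from the question look like in practice (paths are illustrative):

    <!-- noindex: placed in the <head> of each gallery/blog page; crawlers can
         still fetch the page and follow its links, but won't index it -->
    <meta name="robots" content="noindex">

    # robots.txt: placed at the site root; compliant crawlers won't fetch
    # these folders at all, so links on those pages go unseen
    User-agent: *
    Disallow: /gallery/
    Disallow: /blog/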
Technical SEO | Santaur -
Categories in Places vs Local
Say you are listed with both Google Places and Google Local. Places still allows custom categories, while Local limits you to preset categories. Which is the better strategy: to build service pages following the custom services available in Places, or to build out service pages following the (allowed) preset categories in Local?
Technical SEO | waynekolenchuk -
CNAME vs 301 redirect
Hi all,
Recently I created a website for a new client, and my next job is trying to get them higher in Google. I added them to OSE and noticed some strange backlinks. To my surprise, the client has about 20 domain names, all automatically pointing to (showing) the same new main site now:
www.maindomain.nl
www.maindomain.be
www.maindomain.eu
www.maindomain.com
www.otherdomain.nl
www.otherdomain.com
...
Some of these domains have backlinks too (but not that many). I suggested 301 redirecting them all to the main site, just to avoid duplicate content. But now the web host comes into play: "It's a problem, the client has only one hosting account, blablabla...". They told me they could CNAME the 20 domains to the main domain, or point an A record to an IP address. This is too technical for me, so my concrete questions are: Is it smart to do anything at all, or am I just harming my client? The main site is ranking pretty well now, and some backlinks are from its copy sites (probably because the logo links to the full main-site URL everywhere). Does the CNAME or A-record solution have the same effect as a 301 redirect from an SEO perspective?
Many thanks,
Hans
Technical SEO | Houdoe
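A quick sketch of the 301 option discussed above, assuming the parked domains resolve to the same Apache account with mod_rewrite available (domain names taken from the question):

    # Redirect every parked host to the main domain with a 301
    RewriteEngine On
    RewriteCond %{HTTP_HOST} !^www\.maindomain\.nl$ [NC]
    RewriteRule ^(.*)$ https://www.maindomain.nl/$1 [R=301,L]

A CNAME or A record, by contrast, only controls DNS resolution: every domain would keep serving the same content, with no redirect signal sent to Google.
-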
301 vs 302 & Link Juice
Has anyone come across any recent cases of a 302 redirect passing more link juice than before?
Technical SEO | CeeC-Blogger -
Is there a benefit to Microdata vs. RDFa Lite?
Is there any community consensus about whether Microdata or RDFa Lite is the superior rich-snippet format? I work as a designer/front-end developer, and in terms of pure coding, RDFa Lite seems the superior method. It looks to be more flexible and more extensible. The W3C spec is also more mature (it's a W3C Recommendation, whereas Microdata is only a W3C Working Draft), so it's more likely to reach full standardization sooner. Also, because it's a Recommendation, it's less likely to change. However, I hear Google "strongly recommends" the use of Microdata. Do they not support RDFa/RDFa Lite? There doesn't seem to be a great deal of discussion on this anywhere, so I'm tempted to think it's sort of irrelevant. I am aware that Schema.org is, supposedly, now supporting RDFa Lite.
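For a side-by-side feel of the two formats, here is the same schema.org item marked up both ways (the product name is a placeholder):

    <!-- Microdata -->
    <div itemscope itemtype="https://schema.org/Product">
      <span itemprop="name">Example Widget</span>
    </div>

    <!-- RDFa Lite -->
    <div vocab="https://schema.org/" typeof="Product">
      <span property="name">Example Widget</span>
    </div>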
Technical SEO | kongregate -
Rel=Canonical, WWW vs non-WWW, and SEO
Okay, so I'm a bit at a loss here. For whatever reason, just about every single WordPress site I have will turn www.mysite.com into mysite.com in the browser bar. I assume this is the rel=canonical tag at work; there are no 301s on my site. When I use Open Site Explorer and type in www.mysite.com, it shows a domain authority of around 40 and a few hundred backlinks... and then I get the message: "Oh Hey! It looks like that URL redirects to XXXXXX. Would you like to see data for that URL instead?" If I click to see that data instead, I have less than half of that domain authority and about 2 backlinks. Does this make a difference SEO-wise? Should my non-www be redirecting to my www instead, because that's where the domain authority and backlinks are? Why am I getting two different domain authority and backlink counts if they are essentially the same? Or am I wrong, and all that link juice and authority passes just the same?
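On the WordPress point above: the direction of that redirect is typically driven by the site's address settings rather than by a rel=canonical tag. A minimal sketch, assuming a default install where the www host should win (normally changed under Settings → General rather than in SQL):

    -- Set both WordPress address options to the www host;
    -- WordPress then 301-redirects mysite.com to www.mysite.com
    UPDATE wp_options
       SET option_value = 'https://www.mysite.com'
     WHERE option_name IN ('siteurl', 'home');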
Technical SEO | twilightofidols -
Submitting Sitemap File vs Sitemap Index File
Is it better to submit all the sitemap files contained in a sitemap index file manually to Google, or is it about the same as just submitting the master sitemap index file?
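For context, a sitemap index file is just a list of the child sitemap locations, so submitting the index alone tells Google where every child lives. A minimal example (URLs are placeholders):

    <?xml version="1.0" encoding="UTF-8"?>
    <sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <sitemap>
        <loc>https://www.example.com/sitemap-pages.xml</loc>
      </sitemap>
      <sitemap>
        <loc>https://www.example.com/sitemap-blog.xml</loc>
      </sitemap>
    </sitemapindex>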
Technical SEO | AU-SEO