Canonical vs. alternate for country-based subdomain dupe content?
-
What's the correct method for tagging duplicate content between country-based subdomains?
We have:
mydomain.com // default, en-us
www.mydomain.com // en-us
uk.mydomain.com // uk, en-gb
au.mydomain.com // australia, en-au
eu.mydomain.com // europe, en-eu
In the header of each we currently have rel="alternate" tags, but we're still getting duplicate-content warnings in Moz for the "www" subdomain.
Question 1) Are we headed in the right direction with alternate? Or would it be better to use canonical, since the languages are technically all English, just different regions? The content is pretty much the same apart from currency and localization differences.
Question 2) How can we solve the duplicate content between www and the base domain, since the above isn't working?
Thanks so much
-
Yes.
-
Thanks.
So am I safe including all of these on every subdomain?
I have a common header that is exactly the same on every subdomain (all four tags are always included), which I assume is the correct way?
Also: why doesn't Moz look at the hreflang tag? I'm very worried about just "ignoring" what the tool says. Why is the top SEO tool in the world not capable of correctly detecting duplicate content? I'm not comfortable ignoring the check-engine light, so to speak.
-
In cases like yours, hreflang is the correct way to handle the duplicate-content issue, precisely because of the characteristics you cite: currency and localization may be tiny differences in terms of "content," but they are huge in terms of usability, and they make one product page completely different from another.
Remember that if you canonicalize all the "duplicates" to a single canonical URL, the canonicalized URLs won't be shown in the countries you're targeting with them, which would wreck the international SEO strategy completely. Each URL must therefore have itself as its canonical (self-referential), apart from the obvious canonicalization rules (e.g., a URL with a parameter canonicalized to the same URL without it).
If a URL is canonicalized for whatever reason, remember to use the canonical URLs in the href of your hreflang annotations; otherwise Google will start reporting "no return tag" errors.
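Putting that together, every regional version of a page would carry a self-referential canonical plus the same complete hreflang block. A sketch using the subdomains from the question (the `/product` path and the x-default line are assumptions for illustration; x-default is commonly pointed at the default en-us version):

```html
<!-- On uk.mydomain.com/product: the canonical points to itself -->
<link rel="canonical" href="https://uk.mydomain.com/product" />

<!-- The same hreflang block is repeated verbatim on every regional version -->
<link rel="alternate" hreflang="en-us" href="https://www.mydomain.com/product" />
<link rel="alternate" hreflang="en-gb" href="https://uk.mydomain.com/product" />
<link rel="alternate" hreflang="en-au" href="https://au.mydomain.com/product" />
<link rel="alternate" hreflang="en-eu" href="https://eu.mydomain.com/product" />
<link rel="alternate" hreflang="x-default" href="https://www.mydomain.com/product" />
```

Every href should be the canonical (indexable) URL of that regional version, so the hreflang set and the canonicals never contradict each other.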
Regarding the Moz Pro crawler: don't pay attention to it here. It doesn't consider hreflang annotations, so it will keep reporting those pages as duplicates.
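As for question 2 (www vs. the bare domain): hreflang won't help there, because both hosts serve the same en-us content. The usual fix is to pick one host and 301-redirect the other to it. A sketch for Apache's .htaccess, assuming mod_rewrite is available and that www is the preferred host (both assumptions, not something stated in the thread):

```apache
# Permanently (301) redirect the bare domain to the www host,
# preserving the requested path and query string.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^mydomain\.com$ [NC]
RewriteRule ^(.*)$ https://www.mydomain.com/$1 [R=301,L]
```

With the redirect in place, crawlers can only ever reach one version of each en-us page, so the www/bare-domain duplicate warnings should disappear on the next crawl.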