Canonical vs Alternate for country-based subdomain dupe content?
-
What's the correct method for tagging dupe content between country-based subdomains?
We have:
mydomain.com // default, en-us
www.mydomain.com // en-us
uk.mydomain.com // uk, en-gb
au.mydomain.com // australia, en-au
eu.mydomain.com // europe, en-eu
In the header of each we currently have rel="alternate" tags but we're still getting dupe content warnings in Moz for the "WWW" subdomain.
Question 1) Are we headed in the right direction with using alternate? Or would it be better to use canonical, since the languages are technically all English, just different regions? The content is pretty much the same minus currency and localization differences.
Question 2) How can we solve the dupe content between WWW and the base domain, since the above isn't working?
Thanks so much
-
Yes.
-
Thanks.
So then I am safe when including all of these on every subdomain?
I have a common header where the above is the exact same for every subdomain (all 4 are always included), which I assume is the correct way?
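The actual snippet didn't survive in the thread, but based on the subdomains listed in the question, the common header block being described would look something like this (a hypothetical reconstruction; note that "en-eu" is not a valid hreflang value, since EU is not an ISO 3166-1 country code, so the Europe subdomain would typically use plain "en" or be covered by x-default):

```html
<!-- Hypothetical reconstruction of the shared <head> block; hostnames are
     from the question, hreflang codes and x-default are assumptions. -->
<link rel="alternate" hreflang="en-us" href="https://mydomain.com/" />
<link rel="alternate" hreflang="en-gb" href="https://uk.mydomain.com/" />
<link rel="alternate" hreflang="en-au" href="https://au.mydomain.com/" />
<link rel="alternate" hreflang="en" href="https://eu.mydomain.com/" />
<link rel="alternate" hreflang="x-default" href="https://mydomain.com/" />
```

Including the full, identical set on every subdomain (each page referencing itself and all its alternates) is the reciprocal pattern hreflang requires.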
Also: Why doesn't Moz look at the hreflang tag? I'm very worried about just "ignoring" what the tool says... why is the top SEO tool in the world not capable of correctly detecting dupe content? I'm not sure I'm comfortable with just ignoring the check engine light, so to speak.
-
In cases like yours, using hreflang is the correct way to handle the duplicate content issue, precisely because of the characteristics you yourself cite: currency and localization may be tiny differences in terms of "content," but they are huge in terms of usability, and enough to make one product page completely different from another.
Remember that if you canonicalize all the "duplicates" to a single canonical URL, the canonicalized URLs won't be shown in the countries you're targeting with them, which would completely break your international SEO strategy. Each URL must therefore declare itself as its own canonical (a self-referential canonical), apart from the obvious canonicalization rules that still apply (e.g. a URL with parameters canonicalized to the URL without parameters).
If a URL is canonicalized for whatever reason, remember to use the canonical URLs in the href attributes of the hreflang annotations; otherwise Google will start reporting "no return tags" errors.
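For example, on a parameterized URL that is canonicalized, the hreflang hrefs should point at the clean canonical URLs, never the parameterized ones (the path and parameter here are illustrative, not from the thread):

```html
<!-- On a page served at https://uk.mydomain.com/product?sessionid=123 -->
<link rel="canonical" href="https://uk.mydomain.com/product" />
<!-- hreflang annotations reference canonical URLs only -->
<link rel="alternate" hreflang="en-gb" href="https://uk.mydomain.com/product" />
<link rel="alternate" hreflang="en-us" href="https://mydomain.com/product" />
<link rel="alternate" hreflang="en-au" href="https://au.mydomain.com/product" />
```

If an hreflang href pointed at the `?sessionid=123` version, that version's canonicalization away from itself would break the reciprocal pair and trigger the no-return-tag errors mentioned above.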
Regarding the Moz Pro crawler... don't pay attention to it, because it doesn't consider hreflang annotations; it will therefore keep flagging those pages as duplicates.