International SEO - cannibalisation and duplicate content
-
Hello all,
I look after (in house) three domains for one niche travel business across three TLDs: .com, .com.au and .co.uk, plus a fourth domain on a .co.nz TLD, which was recently removed from Google's index.
Symptoms:
For the past 12 months we have been experiencing cannibalisation in the SERPs (namely .com.au pages being rendered in .com results) and Panda-related ranking devaluations between our .com and .com.au sites.
Around 12 months ago the .com TLD was hit hard (an 80% drop in target KWs), probably by Panda, and we began to action the changes below. Around six weeks ago our .com TLD saw big overnight increases in rankings (a 70% average increase to date). However, almost to the same percentage we gained on the .com TLD, we suffered significant drops in our .com.au rankings. Basically, Google seemed to switch its attention from the .com TLD to the .com.au TLD.
Note: each TLD is over six years old; we've never proactively gone after links (Penguin) and have always aimed for quality in an often spammy industry.
**Have done:**
- Adding hreflang markup to all pages on all domains
- Each TLD uses local vernacular, e.g. the .com site uses American English
- Each TLD has pricing in the regional currency
- Each TLD has details of the respective local office and the copy references the location; we have significant press coverage in each country, e.g. The Guardian for our .co.uk site and the Sydney Morning Herald for our Australian site
- Targeting each site to its respective market in WMT
- Each TLD's core pages (within 3 clicks of the primary nav) are 100% unique
- We're continuing to re-write and publish unique content to each TLD on a weekly basis
- As the .co.nz site drove so little traffic, rather than re-writing it we added noindex, and the TLD has almost completely disappeared from the SERPs (16% of pages remain indexed)
- XML sitemaps
- Google + profile for each TLD
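For anyone following along, the hreflang markup in the first bullet looks like this in practice. This is a minimal sketch assuming hypothetical homepage URLs for each TLD; every page needs its own set of tags, including a self-referencing one:

```html
<!-- Placed in the <head> of every page, on every TLD. -->
<!-- Each page carries a self-referencing tag plus one tag per alternate version. -->
<!-- example.com / example.com.au / example.co.uk are illustrative URLs. -->
<link rel="alternate" hreflang="en-us" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-au" href="http://www.example.com.au/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
```

The annotations must be reciprocal: if the .com page points at the .com.au page, the .com.au page must point back, or Google may ignore the pair.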
**Have not done:**
- Hosted each TLD on a local server
- Around 600 pages per TLD are duplicated across all TLDs (roughly 50% of all content). These are way down the IA but still duplicated.
- Images/video sourced from local servers
- Added address and contact details using schema.org markup
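For the schema point in that last bullet, here is a sketch of what address/contact markup could look like once added. All the office details below are hypothetical placeholders; schema.org supports microdata, RDFa and JSON-LD, and this shows the JSON-LD form:

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "TravelAgency",
  "name": "Example Travel (Sydney office)",
  "telephone": "+61-2-5550-0000",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Example Street",
    "addressLocality": "Sydney",
    "addressRegion": "NSW",
    "postalCode": "2000",
    "addressCountry": "AU"
  }
}
</script>
```

Marking up each office with its country code gives search engines one more locale signal per TLD, on top of the WMT geotargeting.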
Any help, advice or just validation on this subject would be appreciated!
Kian
-
Hey Simon,
The Australian site is lang="en-au"
The UK site is lang="en-gb"
The US site is lang="en-us"
We've tried to keep these as tight per country as possible so opted not to use the straight 'en'.
In analytics, there has been some reduction in language referrals, mainly "en-gb" falling from the number-one language type for the US site, which is a positive. Interestingly enough, once we removed the .co.nz from the index, the .com site moved in to dominate the SERPs for brand and some core-KW searches on Google.co.nz.
It's a little unfortunate, as Panda, from my understanding, is keen to spare ccTLDs from any harsh devaluations, but we'll hopefully be able to hit whatever threshold of unique content is required in the near future.
We have review functionality planned for each TLD which should help add value to existing duplicate content. Once this is up and I have some more robust data I'll pull a post together for YouMoz.
Thanks for the feedback!
Kian
-
Wow, that's a pretty comprehensive list of actions you've compiled there, and you seem to have covered pretty much all the bases. I almost think your post should be promoted on YouMoz as a great set of actions for targeting regional websites.
My experience of hreflang is that it is not perfect, in that you occasionally get the wrong version of a page served in the SERPs. I wonder: do you specify the .com as 'en' in the hreflang markup, so that it is treated as the generic English-language version as opposed to being country-specific?
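To illustrate that suggestion, a sketch of an annotation set where the .com is marked as the generic English version, optionally also serving as the x-default for users matching no listed locale (the URLs are hypothetical):

```html
<!-- The .com doubles as the generic English page and the fallback. -->
<link rel="alternate" hreflang="en" href="http://www.example.com/" />
<link rel="alternate" hreflang="en-au" href="http://www.example.com.au/" />
<link rel="alternate" hreflang="en-gb" href="http://www.example.co.uk/" />
<link rel="alternate" hreflang="x-default" href="http://www.example.com/" />
```

The trade-off is that a bare 'en' makes the .com the default for all English speakers outside AU/UK, rather than targeting the US specifically.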
Related Questions
-
Getting rid of duplicate content remaining from old misconfiguration
Hi Friends, We have recently (about a month ago) launched a new website, and during the review of that site spotted a serious misconfiguration of our old, terrible WP site. This misconfiguration, which may have come from sitemaps, internal links, or both, led to our French, German and English sites being displayed on each other's domains. This should be solved now, but they still show in the SERPs. The big question is: what's the best way to safely remove those from the SERPs? We haven't performed as well as we wanted for a while, and we believe this could be one of the issues. Try searching, for instance, "site:pissup.de stag do -junggesellenabschied" to find English pages on our German domain, each link showing either a 301 or a 404. This was cleaned up to return a 301 or 404 when we launched our new site 4 weeks ago, but I can still see the results in the SERPs, so I assume they still count negatively? Cheers!
Intermediate & Advanced SEO
-
How do I optimize dynamic content for SEO?
Hello, folks! I'm wondering how I optimize a site if it is built on a platform that works based on dynamic content. For example, the page pulls in certain information based on the information it has about the user. Not every user will see the same page. Thanks!
Lindsey
Intermediate & Advanced SEO
-
Duplicate content due to parked domains
I have a main ecommerce website with unique content and decent backlinks. I had a few domains parked on the main website, as well as on specific product pages. These domains had some type-in traffic; some were exact product names. So the main website www.maindomain.com had domain1.com and domain2.com parked on it, and also had domain3.com parked on www.maindomain.com/product1. This caused a lot of duplicate content issues. Twelve months back, all the parked domains were changed to 301 redirects. I also added all the domains to Google Webmaster Tools, then removed the main directory from Google's index. Now I realise a few of the additional domains are indexed and causing duplicate content. My question is: what other steps can I take to avoid the duplicate content for my website?

1. Provide a change of address in Google Search Console. Is there any downside to providing a change of address pointing to a website? Also, domains pointing to a specific URL cannot provide a change of address.
2. Provide a remove-page-from-Google-index request in Google Search Console. It is temporary and lasts 6 months. Even if the pages are removed from Google's index, would Google still see them as duplicates?
3. Ask Google to fetch each URL under the other domains and submit them to the Google index. This would hopefully remove the URLs under domain1.com and domain2.com eventually, due to the 301 redirects.
4. Add canonical URLs for all pages on the main site, so Google will eventually remove content from domain1.com and domain2.com due to the canonical links. This will take time for Google to update its index.
5. Point these domains elsewhere to remove the duplicate content eventually. But it will take time for Google to update its index with the new, non-duplicate content.

Which of these options are best for my issue, and which ones are potentially dangerous? I would rather not point these domains elsewhere. Any feedback would be greatly appreciated.
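For option 4, a self-referencing canonical on each page of the main site would look like the sketch below. When a parked domain serves the same HTML, the canonical keeps pointing search engines back at the main domain's URL:

```html
<!-- In the <head> of www.maindomain.com/product1; each page references its own main-domain URL. -->
<link rel="canonical" href="http://www.maindomain.com/product1" />
```

Note that a canonical is a hint rather than a directive, so it works best alongside the 301 redirects already in place rather than instead of them.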
Intermediate & Advanced SEO
-
How can a website have multiple pages of duplicate content - still rank?
Can you have a website with multiple pages of the exact same copy (being different locations of a franchise business) and still be able to rank for each individual franchise? Is that possible?
Intermediate & Advanced SEO
-
Robots.txt & Duplicate Content
In reviewing my crawl results I have 5666 pages of duplicate content. I believe this is because many of the indexed pages are just different ways to get to the same content. There is one primary culprit: a series of URLs related to CatalogSearch, for example http://www.careerbags.com/catalogsearch/result/index/?q=Mobile. I have 10074 of those links indexed according to my Moz crawl. Of those, 5349 are tagged as duplicate content; another 4725 are not. Here are some additional sample links:
http://www.careerbags.com/catalogsearch/result/index/?dir=desc&order=relevance&p=2&q=Amy
http://www.careerbags.com/catalogsearch/result/index/?color=28&q=bellemonde
http://www.careerbags.com/catalogsearch/result/index/?cat=9&color=241&dir=asc&order=relevance&q=baggallini
All of these links are just different ways of searching through our product catalog. My question is: should we disallow catalogsearch via the robots file? Are these links doing more harm than good?
Intermediate & Advanced SEO
-
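Following up on the CatalogSearch question above: if you do decide to block the search results, a minimal robots.txt sketch would be the following. Bear in mind robots.txt stops crawling, not indexing, so already-indexed URLs may linger in the SERPs unless paired with canonicals or noindex:

```txt
# Block all crawlers from internal catalog search result pages.
User-agent: *
Disallow: /catalogsearch/
```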
News section of the website (Duplicate Content)
Hi Mozers, One of our clients wanted to add a NEWS section to their website, where they want to share the latest industry news from other news websites. I tried my best to make them understand the duplicate content issues, but they want it badly. What I am planning is to add rel=canonical from each single news post to the main source website. What do you guys think? Does that affect us in any way?
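For reference, the cross-domain canonical being described would sit in the head of each republished news post and point at the original article, something like the sketch below (the URL is a hypothetical placeholder). Note that Google treats cross-domain canonicals as a hint, not a directive:

```html
<!-- On your copy of the story, pointing at the publisher's original. -->
<link rel="canonical" href="http://www.original-news-source.example/article-slug" />
```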
Intermediate & Advanced SEO
-
Duplicate content for images
On SEOmoz I am getting duplicate content errors in my on-site report. Unfortunately it does not specify what that content is... We are getting these errors for our photo gallery, and I am assuming the reason is that some of the photos are listed in multiple categories. Can this be the problem? What else can it be? How can we resolve these issues?
Intermediate & Advanced SEO
-
Pop Up Pages Being Indexed, Seen As Duplicate Content
I offer users the opportunity to email and embed images from my website. (See this page http://www.andertoons.com/cartoon/6246/ and look under the large image for "Email to a Friend" and "Get Embed HTML" links.) But I'm seeing the ensuing pop-up pages (Ex: http://www.andertoons.com/embed/5231/?KeepThis=true&TB_iframe=true&height=370&width=700&modal=true and http://www.andertoons.com/email/6246/?KeepThis=true&TB_iframe=true&height=432&width=700&modal=true) showing up in Google. Even worse, I think they're seen as duplicate content. How should I deal with this?
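One common approach here (a sketch of one option, not necessarily the only fix) is to add a robots noindex tag to the /embed/ and /email/ page templates, so those pop-up URLs drop out of the index while remaining usable by visitors:

```html
<!-- In the <head> of the /embed/ and /email/ templates. -->
<!-- "noindex, follow" removes the page from the index but still lets crawlers follow its links. -->
<meta name="robots" content="noindex, follow" />
```

Google needs to be able to crawl the pages to see the tag, so they should not also be blocked in robots.txt.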
Intermediate & Advanced SEO