Cross Domain duplicate content...
-
Does anyone have any experience with this situation?
We have 2 ecommerce websites that carry 90% of the same products, with mostly duplicate product descriptions across domains. We will be running some tests shortly.
Question 1:
If we deindex a group of product pages on Site A, should we see an increase in ranking for the same products on Site B? I know nothing is certain, just curious to hear your input.
These two domains have different niche authorities: one is healthcare products, the other general merchandise. We can tell because different products rank higher on one domain or the other. Both sites have the same Moz Domain Authority (42, go figure). We are strongly considering cross-domain canonicals.
Question 2
Does niche authority transfer with a cross-domain canonical? In other words, for a particular product, will it rank the same on both domains regardless of which direction we canonical? Ex:
Site A: Healthcare Products, Site B: General Merchandise. I have a health product that ranks #15 on Site A and #30 on Site B. If I use rel=canonical on this product page on Site B pointing at the same product on Site A, will the ranking be the same as if I point the rel=canonical from Site A to Site B instead? Again, a best guess is fine.
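For reference, a cross-domain canonical is just a regular rel=canonical in the head of the duplicate page, pointing at the full URL on the other domain. A minimal sketch, with hypothetical URLs standing in for your two sites:

```html
<!-- On Site B's (general merchandise) product page,
     e.g. https://site-b.example.com/health-widget -->
<head>
  <!-- Tells Google that Site A's copy is the preferred version -->
  <link rel="canonical" href="https://site-a.example.com/health-widget" />
</head>
```

Reversing the direction just means moving the tag to Site A's page and pointing the href at Site B's URL.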
Question 3:
These domains have similar category page structures, URLs, etc., but feature different products within a given category. Since the pages are different, will cross-domain canonicals be honored by Google?
-
If the alternative is just de-indexing those duplicate pages on one website, then I'd definitely recommend the cross-domain canonicals, yes.
-
Brady:
Thanks for your advice. We are de-indexing as a test to see if our rankings are somehow being constrained because of duplicate content. Our rankings are not reacting as they should to our link building efforts, and I believe that duplicate content is the issue. With this test, I am trying to understand how Google sees and connects these 2 sites. If Google does connect the sites, then we must canonical since Google won't let us have 2 slots on page 1 for the same KW. If Google doesn't connect the sites, then we can theoretically get 2 listings on page 1 if our content is unique.
My hypothesis is that our massive duplicate content is having a negative impact on rankings. Google might be hitting us with a minor Panda slap, or the competing content is somehow hurting us algorithmically. If we deindex a competing page and we are in fact being hurt by the algorithm, the remaining page should see a bump.
I am pretty certain that canonicals will have a positive impact on our rankings. The question I am testing for is "do we have to canonical?" If we don't, then we have a decision to make: do we try to rank both sites for a KW, or canonical and focus on one?
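For anyone following along, a deindex test like the one described above is typically done with a robots noindex directive rather than robots.txt (pages blocked by robots.txt can still remain in the index). A sketch, assuming you can edit the page templates for the test group:

```html
<!-- On each Site A product page in the test group -->
<head>
  <!-- Asks search engines to drop this page from the index;
       links on the page are still followed -->
  <meta name="robots" content="noindex, follow" />
</head>
```

The same directive can also be sent as an X-Robots-Tag HTTP header if editing templates isn't practical.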
-
First of all, these are great questions.
My first question would be: are the sites hosted on the same server or on nearby IP addresses? If they are, and given that much of the content is duplicated, chances are Google and other search engines already understand that these websites are somehow related. This is just something to consider...
Answer #1: If you de-index a group of products on one website, chances are, yes, the other site would see some improvement just from there being one less competitor. But I would do some extensive competitive research first to see how other sites are ranking next to your two sites.
Ultimately, I would side with a cross-domain canonical over de-indexing; that way you're passing some value from one site to the other. I would do this on a product-by-product basis, however, making sure the product you keep indexed matches the site's overall niche theme, not vice versa.
Answer #2: My second paragraph above sort of addresses this. Think from a semantic and topical-understanding perspective: if it's a healthcare product, make sure the site with the healthcare niche is the one being indexed, not the general merchandise website. Even simply from a branding and general marketing perspective, that makes more sense, IMO.
Answer #3: It sounds like, if there are duplicate descriptions (and, I'm guessing, images, headers, and other content pieces), the canonicals would likely be honored. Duplicate content can be a concern even across domains (especially if the sites are hosted together). Remember, though, that canonical tags are merely a suggestion, so it could take some time for Google to honor them, but from the information you've given, I think that's your best shot.
One more thing to consider when using canonical tags: make sure you place the tag on the page that's performing worse of the two, pointing at the stronger one. There may be exceptions based on the niche and from a semantic perspective, but in general, you don't want to hurt your own performance by canonicalizing your stronger page to the less authoritative one.
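To make that direction concrete, here's a sketch with hypothetical URLs: only the weaker (duplicate) page carries the cross-domain canonical, while the stronger page keeps a self-referencing canonical, so both tags end up pointing at the same preferred URL:

```html
<!-- On the weaker page (Site B): points at the stronger version on Site A -->
<link rel="canonical" href="https://site-a.example.com/health-widget" />

<!-- On the stronger page (Site A): self-referencing canonical,
     confirming it as the preferred URL -->
<link rel="canonical" href="https://site-a.example.com/health-widget" />
```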
Good luck! Hope my advice helps.