Reinforcing Rel Canonical? (Fixing Duplicate Content)
-
Hi Mozzers,
We're having trouble with duplicate content between two sites, so we're looking to add some oomph to the rel=canonical link elements we put on one of our sites pointing towards the other, to help speed up the process and give Google a bigger hint.
Would adding a hyperlink on the "copying" website pointing towards the "original" website speed this process up?
Would we get in trouble if we added about 80,000 links (one on each product page), each pointing to the matching product on the other site? For example, we could use text like "Buy XY product on Other Brand Name and receive 10% off!"
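To make the proposal concrete, the combination being described on each "copying" product page might look something like this (a sketch only, with hypothetical URLs and anchor text, not the actual sites' markup):

```html
<!-- In the <head> of the duplicate product page on the "copying" site: -->
<!-- points Google at the preferred "original" version of the page -->
<link rel="canonical" href="http://www.example-original.com/products/xy-product.html" />

<!-- In the <body>: the proposed visible cross-site link -->
<a href="http://www.example-original.com/products/xy-product.html">
  Buy XY product on Other Brand Name and receive 10% off!
</a>
```

The canonical tag is the machine-readable signal; the visible link would just be an extra human-facing (and crawlable) hint on top of it.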
-
Have you seen a corresponding drop-off in the ListFinder pages over that time? If the canonical is kicking in, you should see some of those pages fall out of the index as more ConsumerBase pages kick in.
Is there a reason you're canonicalizing from the more-indexed site to the less-indexed one? It could be a mixed signal if Google thinks that ListFinder is the more powerful or authoritative site. Cross-domain can get tricky fast.
Unfortunately, beyond NOINDEX'ing, it's about your best option, and certainly one of your safest. It's really hard to predict what the combo of cross-domain canonical plus link would do. From a dupe-content standpoint, it's risk-free. From the standpoint of creating 80K links from one of your sites to another of your sites, it's a little risky (you don't want to look like a link network). Since you're only talking about two sites, though, it's probably not a huge issue, especially with the canonical already in place.
Google treats the cross-domain canonical as a strong hint rather than a directive, so it can be a little hard to predict and control. Interestingly, the ConsumerBase site has higher Domain Authority, but the page you provided has lower Page Authority than its "sister" page. That might be a result of your internal linking structure giving more power to the ListFinder pages.
-
Great post Peter.
Here are some links of a product that is on both sites. Hopefully this will help you provide some more insight.
http://www.consumerbase.com/mailing-lists/shutterbugsphotography-enthusiasts-mailing-list.html
http://www.listfinder.com/mailing-lists/shutterbugsphotography-enthusiasts-mailing-list.html
The ListFinder pages are currently mostly indexed (70k out of 80k), which makes me think they are different enough from one another not to warrant a penalty.
The ConsumerBase pages started indexing well when we added the rel=canonical code to LF (we went from about 2k indexed pages to 30k in early December), but since 1/2/2013 we have seen a drop-off in indexed pages, down to about 5k.
Thanks!
-
With products, it's a bit hard to say. Cross-domain canonical could work, but Google can be a bit finicky about it. Are you seeing the pages on both sides in the Google index, or just one or the other? Sorry, it's a bit hard to diagnose without seeing a sample URL.
If this were more traditional syndicated content, you could set a cross-domain canonical and link the copy back to the source. That would provide an additional signal of which site should get credit. In your case, though, I haven't seen a good example of that working - I don't think adding the link would be harmful, though.
If you're talking about 80K links, then you've got 80K+ near-duplicate product pages. Unfortunately, it could go beyond just having one or the other version get filtered out. This could trigger a Panda or Panda-like penalty against the site in general. The cross-domain canonical should help prevent this, whereas the links probably won't. I do think it's smart to be proactive, though.
Worst case, you could META NOINDEX the product pages on one site - they'd still be available to users, but wouldn't rank. I think the cross-domain canonical is probably preferable here, but if you ran into trouble, META NOINDEX would be the more severe approach (and could help solve that trouble).
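As a sketch, that more severe fallback would be a single robots meta tag in the `<head>` of each duplicate product page (a generic snippet, not site-specific markup):

```html
<!-- Keeps the page available to users, but asks engines not to index it. -->
<!-- "follow" lets crawlers still pass through the page's links. -->
<meta name="robots" content="noindex, follow" />
```

Unlike the canonical, this removes the page from the index outright rather than consolidating signals toward the other domain.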
-
Yes, sir - that would be correct.
www.consumerbase.com and www.listfinder.com.
The sites are not 100% identical, just the content on the product pages.
-
Are these two sites on the same root domain? It seems like most of the feedback you're getting is from people who are assuming they are. However, it sounds to me like these are two separate domains.
-
Zora,
Google accepts cross-domain canonical as long as the pages have very similar content.
It is not necessary to add a hyperlink pointing to the canonical page. If your sites are crawler-friendly, the canonical hints should change search results fairly quickly.
http://support.google.com/webmasters/bin/answer.py?hl=en&answer=35769
Ensure that Google doesn't find any issues with your Sitemaps. If you add products frequently, resubmit the updated Sitemap on the same schedule.
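For reference, each product URL entry in the Sitemap can carry a lastmod date so Google knows when a page has changed and is worth recrawling (a minimal sketch with a hypothetical URL):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/mailing-lists/sample-product-mailing-list.html</loc>
    <lastmod>2013-01-15</lastmod>
  </url>
</urlset>
```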
All the best.
-
I'm sorry, I'm not understanding why you need a rel=canonical in this situation if these are two different sites.
What is your end goal?
-
We chose rel canonical because we still want users to be able to visit and navigate through site 2.
They are both e-commerce sites with similar products, not exactly identical sites.
-
Zora, totally understand, but my input, and what the majority of people do, is to redirect the traffic.
A server-side .htaccess 301 redirect is your BEST choice here.
Why don't you want to use a 301 instead of a rel=canonical? Curious what your take is on this.
And thanks for the rel=canonical update info, I didn't know that.
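For comparison, the 301 approach being suggested could be sketched in Apache .htaccess like this (hypothetical domains, and it assumes the two sites share the same URL structure, as the sample URLs in this thread do):

```apache
# On the duplicate site: permanently redirect each product page
# to its twin on the preferred domain.
RewriteEngine On
RewriteRule ^mailing-lists/(.*)$ http://www.example-preferred.com/mailing-lists/$1 [R=301,L]
```

Note this would make the duplicate site's product pages unreachable for users, which is exactly the trade-off Zora is trying to avoid with rel=canonical.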
-
Thanks for the info Hampig, I'll definitely take a look.
Rel canonical actually works cross-domain now; Google updated it from when it originally came out.
-
Zora, hope you are doing well.
I came across this video a few weeks ago. I think the feature is supposed to be found under Webmaster Tools. Although I have not used it, I think it might be the best solution to get Google's attention on portions of the pages and what they are supposed to be:
http://www.youtube.com/watch?v=WrEJds3QeTw
OK, but I'm a bit confused. Do you have two different domains, or two versions of the same domain?
Because from the sound of it you have two different domains, and using rel=canonical won't work, so you would have to do a 301 redirect. Even for my own sites, when I change the pages around I use a 301 redirect on the same existing site.