Cross-Domain Canonical and Duplicate Content
-
Hi Moz fans!
I'm working on SEO for one of my new clients, a job site (I'll call it Site A).
The thing is that the client has about 3 sites with the same jobs on them. I'm seeing a duplicate content problem, only the jobs on the other sites must stay there, so the client doesn't want to remove them. There is another (non-ranking) reason for that.
Can I solve the duplicate content problem with a cross-domain canonical?
The client wants to rank well with the site I'm working on (Site A). Thanks!
Rand did a Whiteboard Friday about cross-domain canonicals:
http://www.seomoz.org/blog/cross-domain-canonical-the-new-301-whiteboard-friday
-
Every document I have seen agrees that canonical tags are honored when the tag is used appropriately.
The tag could be misused either intentionally or unintentionally in which case it would not be honored. The tag is meant to connect pages which offer identical information, very similar information, or the same information presented in a different format such as a modified sort order, or a print version. I have never seen nor even heard of an instance where a properly used canonical tag was not respected by Google or Bing.
-
Thanks Ryan, I didn't notice that about the reply sequencing, and you're right, I read them in the wrong order. It makes much more sense now.
By "some" support, I meant that even Google via Matt Cutts says that they don't take cross domain canonical as "a directive" but rather a "hint" (and even that assumes Google agrees with you, that your pages are duplicates).
So the magic question is: how much authority do Bing and Google give rel="canonical", and is it similar between the two engines?
-
One aspect of the SEOmoz Q&A structure I dislike is the ordering of responses. Rather than maintaining a timeline order, the responses are re-ordered based on other factors such as "thumbs-up" votes and staff endorsements. I understand the concept that replies which are liked more are probably more helpful and should be seen first, but it causes confusion such as in this case.
Dr. Pete's response on the Bing cross-domain canonical topic appears first, but it was offered second-to-last chronologically. We originally agreed there was no evidence indicating Bing supported the cross-domain canonical tag; then he located such evidence, and therefore we agree Bing does support the tag.
The statement Dr. Pete shared was that "Bing does support cross-domain canonical". There was no limiting factor. I mention this because you said they offered "some" support and I am not sure why you used that qualifier.
-
Ryan, at the end of the thread you linked to, it seems like both Dr. Pete and you agreed that there wasn't much evidence of Bing support. Have you learned something that changed your mind?
I know a rep from Bing told Dr. Pete there was "some" support, but what does that mean? i.e., do exactly identical sites pass a little juice/authority, or do similar sites pass **a lot** of juice/authority?
Take a product that has different brands in different parts of the country: Hellmann's and Best Foods, for example. They have two sites which are the same except for logos. Here is a recipe from each site.
http://www.bestfoods.com/recipe_detail.aspx?RecipeID=12497&version=1
http://www.bestfoods.com/recipe_detail.aspx?RecipeID=12497&version=1
The sites are nearly identical except for logos/product names.
For the (very) long tail keyword "Mayonnaise Bobby Flay Waldorf salad wrap" Best Foods ranks #5 and Hellmann's ranks #11.
I doubt they have an SEO looking very closely at the sites, because in addition to their duplicate content problem, neither page has a meta description.
If the Hellmann's page had a
`<link rel="canonical" href="http://www.bestfoods.com/recipe_detail.aspx?RecipeID=12497&version=1" />`
tag, I'd expect to see the Best Foods page move up and Hellmann's move down in Google. But Bing appears not to like the duplicate pages as much: currently the Best Foods version ranks #12 and the Hellmann's page doesn't rank at all. My own (imperfect) tests lead me to believe that adding the rel="canonical" would help in Google but not Bing.
Obviously, the site owner would probably like one of those two pages to rank very high for the unbranded keyword, but they would want both pages to rank well if I added a branded term. My experience with cross-domain canonicals in Google leads me to believe that even the non-canonical version would rank for branded keywords in Google, but what would Bing do?
I'd be very cautious about relying on the cross-domain canonical in Bing until I see some PUBLIC announcement that it's supported.
-
I was a bit confused when I read that. You put my mind at rest!
-
My apologies Atul. I am not sure what I was thinking when I wrote that. Please disregard.
-
Thanks Ryan!
So it will be a canonical tag.
-
I would advise NOT using the robots.txt file if at all possible. In general, the robots.txt file is a means of absolute last resort. The main reason I use the robots.txt file is because I am working with a CMS or shopping cart that does not have the SEO flexibility to noindex pages. Otherwise, the best robots.txt file is a blank one.
When you block a page in robots.txt, you are not only preventing content from being indexed, but you are blocking the natural flow of page rank throughout your site. The link juice which flows to the blocked page dies on the page as crawlers cannot access it.
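To illustrate (the path here is hypothetical), a robots.txt rule like this stops crawlers from ever fetching the job pages, so any link juice flowing into them dies there:

```
# robots.txt at the site root
# Blocks all crawlers from fetching anything under /jobs/
User-agent: *
Disallow: /jobs/
```

By contrast, a `<meta name="robots" content="noindex, follow">` tag in a page's `<head>` keeps the page out of the index while still letting crawlers fetch it and follow its links, so page rank continues to flow.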
-
That is correct. If you choose to read the information directly from Google, it can be found here:
-
Thanks!
It's for a site in the Netherlands, and Google is about 98% of the market. Bing is coming up, though, so it's a thing to check.
No-roboting is an approach I didn't think about! Thanks for that. I will check with the client.
-
Thanks Ryan!
So the setup is like this:
On the other sites I will use the canonical to point everything to Site A.
-
You mean rel=author on Site A? How does it help? Where should rel=author point to?
-
According to Dr. Pete, Bing does support cross-domain canonicals.
If you disagree, I would first recommend using rel=author to establish that "Site A" is the source of the article.
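As a sketch of what that could look like (the URL and author name are hypothetical), rel=author is expressed as a link element or a byline anchor on the article page:

```html
<!-- In the <head> of the job page on Site A (hypothetical URL) -->
<link rel="author" href="http://site-a.example.com/about/author-name" />

<!-- Or inline, in the article byline -->
<p>By <a rel="author" href="http://site-a.example.com/about/author-name">Author Name</a></p>
```

Either form signals which page (and author) should be treated as the original source of the content.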
-
A cross-domain canonical will help with Google (make sure the pages truly are duplicates, or very close); however, I haven't found any confirmation yet that Bing supports the cross-domain canonical.
If the other sites don't need to rank at all, you could also consider no-roboting the job pages on the other sites, so that only Site A's job listings get indexed.
-
Yes. A cross-domain canonical would solve the duplicate content issue and consolidate ranking signals on the main site.
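As a rough sketch (all domains here are hypothetical), each duplicate job page on the other sites would carry a canonical tag in its `<head>` pointing at the matching page on Site A:

```html
<!-- On the duplicate job page, e.g. http://site-b.example.com/jobs/12345 -->
<head>
  <link rel="canonical" href="http://site-a.example.com/jobs/12345" />
</head>
```

Keep in mind the tag is treated as a hint rather than a directive, so the paired pages need to be true duplicates (or very close) for the engines to honor it.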