Cross-Domain Canonical and Duplicate Content
-
Hi Mozfans!
I'm working on SEO for one of my new clients, a job site (I'll call it Site A).
The thing is that the client has about three sites with the same jobs on them. I'm facing a duplicate content problem, but the jobs on the other sites must stay there; the client doesn't want to remove them, for a reason unrelated to rankings.
Can I solve the duplicate content problem with a cross-domain canonical?
The client wants to rank well with the site I'm working on (Site A). Thanks!
Rand did a Whiteboard Friday about the cross-domain canonical:
http://www.seomoz.org/blog/cross-domain-canonical-the-new-301-whiteboard-friday
-
Every document I have seen agrees that canonical tags are followed when the tag is used appropriately.
The tag could be misused, either intentionally or unintentionally, in which case it would not be honored. The tag is meant to connect pages which offer identical information, very similar information, or the same information presented in a different format, such as a modified sort order or a print version. I have never seen nor even heard of an instance where a properly used canonical tag was not respected by Google or Bing.
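For instance, a re-sorted view of a listing page would typically declare the default version as canonical. A minimal sketch, with hypothetical URLs:

```html
<!-- In the <head> of example.com/jobs?sort=salary, a re-sorted
     duplicate of the default listing (URLs are hypothetical) -->
<link rel="canonical" href="http://www.example.com/jobs" />
```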
-
Thanks Ryan, I didn't notice that about the reply sequencing, and you're right, I read them in the wrong order. It makes much more sense now.
By "some" support, I meant that even Google, via Matt Cutts, says they treat the cross-domain canonical not as "a directive" but as a "hint" (and even that assumes Google agrees with you that your pages are duplicates).
So the magic question is: how much authority do Bing and Google give rel="canonical", and is it similar between the two engines?
-
One aspect of the SEOmoz Q&A structure I dislike is the ordering of responses. Rather than maintaining chronological order, the responses are re-ordered based on other factors such as thumbs-ups and staff endorsements. I understand the concept that replies which are liked more are probably more helpful and should be seen first, but it causes confusion, as in this case.
Dr. Pete's response on the Bing cross-domain canonical topic appears first, but it was offered second-to-last chronologically. We originally agreed there was no evidence indicating Bing supported the cross-domain canonical tag; then he located such evidence, and we now agree Bing does support the tag.
The statement Dr. Pete shared was that "Bing does support cross-domain canonical." There was no limiting factor. I mention this because you said they offered "some" support, and I am not sure why you used that qualifier.
-
Ryan, at the end of the thread you linked to, it seems like both you and Dr. Pete agreed that there wasn't much evidence of Bing support. Have you learned something that changed your mind?
I know a rep from Bing told Dr. Pete there was "some" support, but what does that mean? That is, do exactly identical sites pass a little juice/authority, or do similar sites pass a lot of juice/authority?
Take a product sold under different brands in different parts of the country: Hellmann's and Best Foods, for example. They have two sites which are the same except for logos. Here is a recipe from each site.
http://www.bestfoods.com/recipe_detail.aspx?RecipeID=12497&version=1
http://www.bestfoods.com/recipe_detail.aspx?RecipeID=12497&version=1
The sites are nearly identical except for logos and product names.
For the (very) long tail keyword "Mayonnaise Bobby Flay Waldorf salad wrap" Best Foods ranks #5 and Hellmann's ranks #11.
I doubt they have an SEO looking very closely at the sites because, in addition to the duplicate content problem, neither page has a meta description.
If the Hellmann's page had a canonical tag like
<link rel="canonical" href="http://www.bestfoods.com/recipe_detail.aspx?RecipeID=12497&version=1" />
I'd expect to see the Best Foods page move up and Hellmann's move down in Google. But Bing appears not to like the duplicate pages as much: currently the Best Foods version ranks #12 and the Hellmann's version doesn't rank at all. My own (imperfect) tests lead me to believe that adding the rel="canonical" would help in Google but not in Bing.
Obviously, the site owner would probably like one of those two pages to rank very high for the unbranded keyword, but they would want both pages to rank well if I added a branded term. My experience with the cross-domain canonical in Google leads me to believe that even the non-canonical version would rank for branded keywords in Google, but what would Bing do?
I'd be very cautious about relying on the cross-domain canonical in Bing until I see some PUBLIC announcement that it's supported.
-
I was a bit confused when I read that. You put my mind at rest!
-
My apologies Atul. I am not sure what I was thinking when I wrote that. Please disregard.
-
Thanks Ryan!
So it will be a canonical tag.
-
I would advise NOT using the robots.txt file if at all possible. In general, the robots.txt file is a means of absolute last resort. The main reason I use the robots.txt file is because I am working with a CMS or shopping cart that does not have the SEO flexibility to noindex pages. Otherwise, the best robots.txt file is a blank one.
When you block a page in robots.txt, you are not only preventing the content from being indexed but also blocking the natural flow of PageRank through your site. The link juice which flows to a blocked page dies there, because crawlers cannot access it.
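The alternative this reply implies is the meta robots noindex tag, which keeps the page crawlable (so link equity can still flow through it) while keeping it out of the index. A minimal sketch:

```html
<!-- In the <head> of the page: unlike a robots.txt Disallow,
     crawlers can still fetch this page and follow its links,
     but it stays out of the index -->
<meta name="robots" content="noindex, follow">
```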
-
That is correct. If you choose to read the information directly from Google, it can be found here:
-
Thanks!
It's for a site in the Netherlands, where Google has about 98% of the market. Bing is coming up, though, so it's a thing to check.
No-roboting is a way to do it that I hadn't thought about! Thanks for that. I will check with the client.
-
Thanks Ryan!
So the setup will be like this:
On the other sites I will use the canonical to point everything to Site A.
-
You mean rel=author on Site A? How does it help? Where should rel=author point to?
-
According to Dr. Pete, Bing does support the cross-domain canonical.
If you disagreed, I would first recommend using rel=author to establish that "Site A" is the source of the article.
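As a sketch of that suggestion, with a hypothetical Site A author URL (engine support for rel=author varied, so treat this as illustrative only):

```html
<!-- On the duplicate job pages, identifying the Site A author page
     as the source (URL is hypothetical) -->
<link rel="author" href="http://www.site-a.example/about/author" />
```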
-
A cross-domain canonical will help with Google (make sure the pages truly are duplicates, or very close). However, I haven't found any confirmation yet that Bing supports the cross-domain canonical.
If the other sites don't need to rank at all, you could also consider no-roboting the job pages on the other sites, so that only Site A's job listings get indexed.
-
Yes. A cross-domain canonical would solve the duplicate content issue and focus ranking signals on the main site.
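A minimal sketch of what that could look like on one of the duplicate sites, with hypothetical URLs:

```html
<!-- In the <head> of the matching job page on Site B or Site C,
     pointing at the Site A original (URLs are hypothetical) -->
<link rel="canonical" href="http://www.site-a.example/jobs/12345" />
```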