Duplicate content on partner site
-
I have a trade partner who will be using some of our content on their site. What's the best way to prevent any duplicate content issues?
Their plan is to attribute the content to us using rel=author tagging. Would this be sufficient or should I request that they do something else too?
Thanks
-
Cross-domain canonical is the most viable option here. As Mike and Chris said, it is possible for Google to ignore the tag in some cases, but it's a fairly strong suggestion. There are two main reasons I'd recommend it:
(1) Syndicated content is the entire reason Google allowed the use of rel=canonical across domains. SEOs I know at large publishers have used it very effectively. While your situation may not be entirely the same, it sounds similar to a syndicated content scenario.
(2) It's really your only viable option. While a 301 redirect is almost always honored by Google, as Chris suggested, it's also very different: a 301 takes visitors on the partner's page directly to your page, and that's not your intent. Rel=canonical leaves visitors on the partner page but tells search engines to credit the content to the source page. Google experimented with a content syndication tag, but that tag has been deprecated, so in most cases rel=canonical is the best choice we have left.
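For reference, a cross-domain canonical is just a single link element in the head of the partner's copy of the article. A minimal sketch, assuming your original lives at the hypothetical URL `https://www.example.com/original-article`:

```html
<!-- In the <head> of the partner's copy of the syndicated article -->
<link rel="canonical" href="https://www.example.com/original-article" />
```

The href should point at the exact, indexable URL of the original (matching protocol and trailing-slash form); a target that redirects or 404s gives Google a reason to ignore the hint.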
-
As far as I'm aware, Google's webmaster guidelines state the following:

"Can rel="canonical" be used to suggest a canonical URL on a completely different domain? There are situations where it's not easily possible to set up redirects. This could be the case when you need to migrate to a new domain name using a web server that cannot create server-side redirects. In this case, you can use the rel="canonical" link element to specify the exact URL of the domain preferred for indexing. While the rel="canonical" link element is seen as a hint and not an absolute directive, we do try to follow it where possible."

That said, rel="canonical" is intended more for on-site use than cross-site use. Supporting this, Matt Cutts has mentioned that Google prefers a 301 where one is possible.
So there's a bit of truth in it.
-
My favorite answer... canonicals. If your trade partner's site places rel="canonical" tags pointing back to the original source of the content on your site, then there shouldn't be any duplicate content issue. Of course, canonicals are suggestions, not directives, so the search engines reserve the right to ignore the tag if they deem it irrelevant. Using the tag in this way will essentially pass all the equity to your site and rank your page instead of your trade partner's. Your trade partner would get basically no benefit from hosting your content as far as search is concerned. The better option for everyone would likely be to write unique, relevant content.
-
Hi,
You may want to read the following :
https://support.google.com/webmasters/answer/66359?hl=en
Technically you should be fine, though I never recommend duplicating content across sites; it reduces the quality of both. As long as there is a link back to the original source, you should be OK.
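If you want to spot-check that a partner page actually declares the canonical link back to your source, a short script can extract it. This is a minimal sketch using only the Python standard library; the page markup and URLs here are hypothetical placeholders:

```python
from html.parser import HTMLParser


class CanonicalFinder(HTMLParser):
    """Collects the href of any <link rel="canonical"> element in a page."""

    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag != "link":
            return
        attrs = dict(attrs)  # attrs arrives as a list of (name, value) pairs
        if attrs.get("rel", "").lower() == "canonical":
            self.canonical = attrs.get("href")


def find_canonical(html: str):
    """Return the canonical URL declared in an HTML document, or None."""
    parser = CanonicalFinder()
    parser.feed(html)
    return parser.canonical


# Example: a hypothetical partner page crediting the original source
partner_page = """
<html><head>
  <title>Syndicated article</title>
  <link rel="canonical" href="https://www.example.com/original-article" />
</head><body>...</body></html>
"""
print(find_canonical(partner_page))
# -> https://www.example.com/original-article
```

In practice you would fetch the partner's live page (e.g. with urllib) and feed the response body to `find_canonical`, then compare the result against your original URL.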
-
Hi Chris. I don't care about the trade partner. But are you saying I could receive a penalty if they copy and paste content off my website? Surely that's not fair!
-
Easy fix: don't use duplicate content!
You may still take a ranking hit, so it's better to invest the time to rewrite the material or produce fresh content.
They can link to your site if they want to use the content; the user would still see it. Simply putting a duplicate of the content on their site will result in a drop for both of you. It may not happen right away, but it will over time.
Hope this helps, and good luck!