Letting Others Use Our Content: Risk-Free Attribution Methods
-
Hello Moz!
A massive site that you've all heard of is looking to syndicate some of our original editorial content. This content is our bread and butter, and is one of the primary reasons why people use our site.
Note that this site is not a competitor of ours - we're in different verticals.
If this massive site were to use the content straight up, I'm fairly confident that they'd begin to outrank us for related terms pretty quickly due to their monstrous domain authority.
This is complex because they'd like to use bits and pieces of the content interspersed with their own content, so they can't just implement a cross-domain canonical. It'd also be difficult to load the content in an iframe with noindex,nofollow header tags since their own content (which they want indexed) will be mixed up with ours.
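For context, the ruled-out approach on a wholesale copy would look something like this (a sketch only - URLs are hypothetical placeholders):

```html
<!-- On the syndicating site's copy of a page: a cross-domain canonical
     pointing Google back to the original on our domain. This only works
     when the copy is a near-duplicate of the original page, which is why
     interspersed content rules it out. -->
<link rel="canonical" href="https://www.oursite.example/products/widget/review" />

<!-- The iframe alternative would rely on serving the framed document with
     an HTTP header such as:  X-Robots-Tag: noindex, nofollow
     which can't be applied selectively once their indexable content is
     mixed in with ours on the same page. -->
```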
They're also not open to including a link back to the product pages where the corresponding reviews live on our site.
Are there other courses of action that could be proposed that would protect our valuable content?
Is there any evidence that using schema.org (Review and Organization schemas) pointing back to our review page URLs would provide attribution and prevent them from outranking us for associated terms?
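For reference, Review markup with attribution pointed back at the original would look roughly like this - a sketch with hypothetical URLs, making no claim that Google treats the `url` property as a canonical-style attribution signal:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Review",
  "url": "https://www.oursite.example/products/widget/review",
  "itemReviewed": { "@type": "Product", "name": "Example Widget" },
  "author": {
    "@type": "Organization",
    "name": "Our Site",
    "url": "https://www.oursite.example/"
  },
  "reviewBody": "Excerpt of the syndicated review text..."
}
</script>
```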
-
Logan, I found your replies very helpful. We have allowed a site to replicate some of our pages and content on their site, with a rel=canonical tag in place pointing back to us. However, Google has indexed the pages on the partner's site as well. Is this common, or has something gone wrong? The partner temporarily had an original-source tag pointing to their own page alongside the canonical pointing to us. We caught this issue a few weeks ago and had the original-source tag removed. GSC sees the rel=canonical tag pointing to our site, but I am concerned our site could be hurt by duplicate content issues, and the partner site may outrank us as their site is much stronger. Any insight would be greatly appreciated.
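For anyone auditing a similar setup: the partner's page head should now carry only the canonical back to the source URL. The conflicting pair of signals described above would have looked something like this (hypothetical URLs):

```html
<!-- Correct: a single cross-domain canonical back to the source page. -->
<link rel="canonical" href="https://www.oursite.example/page" />

<!-- The conflict that was removed: an original-source tag claiming the
     partner's own copy as the original, contradicting the canonical. -->
<meta name="original-source" content="https://www.partner.example/page" />
```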
-
"Why did this offer come my way?"
When someone asks to use your content, that is what you should be asking yourself.
When someone asks to use my content, my answer is always a fast NO! Even if the Pope is asking, the answer will be NO.
-
This is exactly my concern. Our site is massive in its own industry, but this other site is a top player across many industries - surely we'd be impacted by such an implementation without some steps taken to confirm attribution.
Thank you for confirming my suspicions.
-
Google claims that they are good at identifying the originator of the content. I know for a fact that they are overrating their ability on this.
Publish an article first on a weak site, allow it to be crawled and remain for six months. Then, put that same article on a powerful site. The powerful site will generally outrank the other site for the primary keywords of the article or the weak site will go into the supplemental results. Others have given me articles with the request that I publish them. After I published them they regretted that they were on my site.
Take pieces of an article from a strong site and republish them verbatim on a large number of weak sites. The traffic to the article on the strong site will often drop because the weak sites outrank it for long-tail keywords. I have multiple articles that were ranking well for valuable keywords. Then hundreds of mashup sites grabbed pieces of the article and published them verbatim. My article tanked in the SERPs. A couple years later the mashups fell from the SERPs and my article moved back up to the first page.
-
But, I would not agree with their site being the one to take the damage. YOU will lose a lot of long-tail keyword traffic because now your words are on their site and their site is powerful.
Typically, the first site that's crawled is considered the originator of the content; if another site then uses that content, that site is the one that takes the damage (when there is any). I was under the impression that your content was indexed first and that the other site would be using your content - at least that's the way I understood it.
So, if your content hasn't already been indexed, then you may lose out here.
-
This is complex because they'd like to use bits and pieces of the content interspersed with their own content, so they can't just implement a cross-domain canonical. It'd also be difficult to load the content in an iframe with noindex,nofollow header tags since their own content (which they want indexed) will be mixed up with ours.
Be careful. This is walking past the alligator ambush. I agree with Eric about the rel=canonical. But, I would not agree with their site being the one to take the damage. YOU will lose a lot of long-tail keyword traffic because now your words are on their site and their site is powerful.
They're also not open to linking back to our content.
If these guys walked into my office with their proposal they might not make it to the exit alive.
My only offer would be for them to buy me out completely. That deal would require massive severances for my employees and a great price for me.
-
You're in the driver's seat here. _You_ have the content _they_ want. If you lay down your requirements and they don't want to play, then don't give them permission to use your content. It's really that simple. You gain nothing under their rules, and they gain a lot. You should both be winning in this situation.
-
Thank you for chiming in Eric!
Their pages already rank extraordinarily well - #1 for almost every related term they have products for, across the board.
They're also not open to linking back to our content.
-
In an ideal situation, the canonical tag is preferred. Since you mentioned that it's not the full content and you can't implement it, your options may be limited. We haven't seen any evidence that schema markup pointing back to your review page URLs would prevent them from outranking you - it's not likely to. If there are links there, though, you'd get some link equity passed on.
Most likely, though, if that content is already indexed on your site then it's going to be seen as duplicate content on their site--and would only really hurt their site, in that those pages may not rank.