Purchasing duplicate content
-
Morning all,
I have a client who is planning to expand their product range (online dictionary sites) into new markets and is considering acquiring data sets from low-ranked competitors to supplement their own original data. The content sets are quite large, which means a very high percentage of the site (hosted on a new subdomain) would consist of duplicate content. To clarify, the competitor's content would stay online as well.
I need to lay out the pros and cons of this approach so that they can move forward knowing the full facts. As I see it, it would mean forgoing rankings for most of the site and would require a heavy dose of original content built around the acquired data. My main concern is that launching with this level of duplicate data would damage the authority of the subdomain and, subsequently, the overall domain.
I'd love to hear your thoughts!
-
Thanks for the great response, some really useful thoughts.
To address your final point, the site is considerably stronger than the content creator's, so it's reassuring to hear that this could be the case. Of course, we'll be recommending that as much of the data as possible is curated and that the pages are improved with original content.
-
Wow, this is a loaded question. The way I see it we can break this up into two parts.
First, subdomains vs. domains vs. subfolders. There has been a lot of discussion about which structure is most SEO-friendly, and to keep it really simple: if you're concerned about SEO, a subfolder structure is going to be the most beneficial. If you create a separate domain, that will be duplicate content, and it does impact rankings. Subdomains are a little more complex, and I don't recommend them for SEO. In some cases Google views subdomains as spam (think of all the PBNs created with blogspot.com), and in other cases a subdomain is treated as a separate website. By structuring something as a subdomain, you're indicating that the content is different enough from the main content of the root domain that you don't feel it belongs together. An appropriate example of this in the wild might be different language versions of a website, which especially makes sense in countries where a single TLD serves multiple languages (like Switzerland, which has four national languages).
Next, the concept of duplicate content is different depending on whether the content is duplicated internally or externally. It's common for websites to have a certain amount of duplicate or common content within their own pages. The number that has been repeated for years as a "safe" threshold is 30%, a stat Matt Cutts threw out before he retired. I use siteliner.com to discover how much content has been replicated internally. Externally, having the same content as another website can pretty dramatically impact your rankings. Google does a decent job of assigning content to the correct website (who had it first, etc.), but they have a long way to go.
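To get a feel for what a tool like siteliner.com is measuring, here's a minimal sketch of one common way to estimate overlap between two pages: word-level shingles compared with Jaccard similarity. The tokenizer and shingle size are illustrative assumptions, not Siteliner's actual method, and the sample texts are invented.

```python
import re

def shingles(text, k=5):
    """Return the set of k-word shingles (consecutive word tuples) in a text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    return {tuple(words[i:i + k]) for i in range(len(words) - k + 1)}

def overlap(text_a, text_b, k=5):
    """Jaccard similarity of the two pages' shingle sets, from 0.0 to 1.0."""
    a, b = shingles(text_a, k), shingles(text_b, k)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)

# Invented example: a short definition and a longer variant of it.
page_a = "define widget noun a small gadget or mechanical device"
page_b = ("define widget noun a small gadget or mechanical device "
          "especially one whose name is unknown")
print(round(overlap(page_a, page_b), 2))  # → 0.45
```

With a crawl of both sites, running every acquired page against its closest match on the original publisher would give you a concrete duplicate-content figure to weigh against that 30% rule of thumb.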
If you can assimilate the new content and redirect the old pages on a 1:1 basis to their new locations, it's probably safe enough to do, and hopefully you will structure it in a way that makes it useful to users. If you can't perform the redirects, I think you're more likely to struggle to achieve SEO goals for those new pages. In that case, take the time to set realistic expectations and track something like user engagement between new and old content so you have a realistic understanding of your successes and challenges.
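For the 1:1 redirect piece, the deliverable is just a mapping of every acquired URL to its new home, rendered in whatever syntax your server uses. A small sketch, with invented paths and nginx `map`-style output as one possible target:

```python
# Hypothetical 1:1 mapping of acquired URLs to their new locations on the
# buyer's site. All paths here are invented for illustration.
OLD_TO_NEW = {
    "/define/widget": "/dictionary/widget",
    "/define/gadget": "/dictionary/gadget",
}

def nginx_map_lines(mapping):
    """Render each old path as one nginx map entry pointing at its new path."""
    return ["{} {};".format(old, new) for old, new in sorted(mapping.items())]

for line in nginx_map_lines(OLD_TO_NEW):
    print(line)
```

The same dictionary could just as easily drive `.htaccess` `Redirect 301` lines; the point is keeping a single authoritative old-to-new mapping so every legacy URL resolves somewhere sensible.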
-
I would be thinking about these topics...
- How many other companies are purchasing or have purchased this data? Is it out there on lots of sites, and is that number growing?
- Since this is a low-ranking competitor, how much additional money would it take to simply buy the entire company (provided the data is not already out there on a ton of other websites)?
- Rather than purchasing this content, what would be the cost of original authorship for just the words that produce the bulk of the traffic? On most reference sites, something like 10% of the content produces over 50% of the traffic.
- Knowing that in most duplicate content situations a significantly stronger site will crush the same content on the original publisher, where do I sit in this comparison of power?
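That head/tail split is easy to check against your own analytics export before deciding what to author from scratch. A minimal sketch, assuming a simple list of (entry, visits) pairs with made-up numbers:

```python
def head_share(pages, traffic_share=0.5):
    """Fraction of pages (by count) needed to cover `traffic_share` of visits."""
    ranked = sorted(pages, key=lambda p: p[1], reverse=True)
    total = sum(visits for _, visits in ranked)
    running, needed = 0, 0
    for _, visits in ranked:
        running += visits
        needed += 1
        if running >= total * traffic_share:
            break
    return needed / len(ranked)

# Invented sample: one head term dominates a long tail.
sample = [("widget", 900), ("gadget", 50), ("doohickey", 30), ("thingamajig", 20)]
print(head_share(sample))  # → 0.25 (1 of 4 entries covers over half the visits)
```

If your own data shows a similarly steep curve, commissioning original entries for just the head terms may cost far less than licensing the whole duplicate set.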