Could we run into issues with duplicate content penalties if we were to borrow product descriptions?
-
Hello,
I work for an online retailer that has the opportunity to add a lot of SKUs to our site in a relatively short amount of time by borrowing content from another site (with their permission). There are a lot of positives to doing this, but one big question we have is what the borrowed content will do to our search rankings (we normally write our own original content in-house for a couple thousand SKUs). Organic search traffic brings in a significant chunk of our business, and we definitely don't want to do anything that would jeopardize our rankings.
Could we run into issues with duplicate content penalties if we were to use the borrowed product descriptions?
Is there a rule of thumb for what proportion of the site should be original content vs. duplicate content without running into issues with our search rankings?
Thank you for your help!
-
I think Alan and EGOL have summed it up nicely for you.
I have looked at a lot of Panda-hit sites, and one of the most common issues was e-commerce sites consisting primarily of stock product descriptions. Why would Google want to rank a site highly that just contains information hundreds of other sites already have?
If a large chunk of your site contains duplicate descriptions like this, you can attract a Panda flag, which can cause your whole site to rank poorly, not just the product pages.
You could use the duplicate product descriptions if you surrounded them with a large amount of original and helpful text. However, no one knows what the ratio is. If you have the ability to rewrite the product descriptions, that is by far the best thing to do.
-
Just adding a point to this (and with reference to the other good points left by others): writing good product descriptions isn't actually that expensive!
It always seems that way, as they are usually done in big batches. On a per-product basis, however, they are pretty cheap. Do it well and you will not only improve your search results, but also improve conversions and even make the page more linkable.
Pick a product at random. Would it be worth a few £/$ to sell more of that item? If not, remove it from the site anyway.
-
Adding a lot of SKUs to your site in a relatively short amount of time by borrowing content from another site sounds more like a bad sales pitch than a good "opportunity". If you don't want to put a significant chunk of your business in jeopardy, then simply drip the new SKUs in as you get new content for them. The thin content is not likely to win you any new search traffic, so unless adding it is going to quickly and dramatically increase sales from your existing traffic sources, why go down that road?
-
Adding emphasis to the danger:
Duplicate product descriptions are the single most problematic issue e-commerce sites face from an SEO perspective. Not only are most canned descriptions so short that they cause product pages to be considered thin on content, but copied/borrowed descriptions are also likely to be spread across countless sites.
While it may seem like an inordinate amount of time/cost, unique, quality descriptions that are long enough to truly identify product pages as worthy will go a long way toward proving a site deserves ranking and trust.
-
You can hit Panda problems doing this. If you have a lot of this content, the rankings of your entire site could be damaged.
It is best to write your own content, or to use this content on pages that are kept out of the index until you have replaced it with original content.
Or you could publish it to get into the index and replace it as quickly as possible.
The site you are getting this content from could be damaged as well.
-
You definitely could run into trouble here. Duplicate content of this type is meant to be dealt with on a page-level basis. However, if Google thinks it is manipulative, it can impact the domain as a whole. By "thinks" I really mean "if it matches certain patterns that manipulative sites use" - there is rarely an actual human review.
It is more complex than a simple percentage; many factors are likely involved. However... there is a solution!
You can simply add a noindex tag to the product pages that have non-original content. That'll keep them out of the index and keep you on the safe side of duplicate-content issues.
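For anyone unsure what that looks like in practice, here is a minimal sketch of the robots meta tag, placed in the `<head>` of each product page carrying a borrowed description (the `follow` directive is optional; it is shown here as one common pairing, assuming you still want crawlers to follow the page's links):

```html
<!-- Place inside the <head> of any product page using a borrowed description. -->
<!-- "noindex" keeps the page out of the search index;
     "follow" still lets crawlers follow links on the page. -->
<meta name="robots" content="noindex, follow">
```

Once a page's description has been rewritten as original content, remove the tag so the page can be indexed again.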