Could we run into issues with duplicate content penalties if we were to borrow product descriptions?
-
Hello,
I work for an online retailer that has the opportunity to add a lot of SKUs to our site in a relatively short amount of time by borrowing content from another site (with their permission). There are a lot of positives for us to do this, but one big question we have is what the borrowed content will do to our search rankings (we normally write our own original content in house for a couple thousand SKUs). Organic search traffic brings in a significant chunk of our business and we definitely don't want to do something that would jeopardize our rankings.
Could we run into issues with duplicate content penalties if we were to use the borrowed product descriptions?
Is there a rule of thumb for what proportion of the site should be original content vs. duplicate content without running into issues with our search rankings?
Thank you for your help!
-
I think Alan and EGOL have summed it up nicely for you.
I have looked at a lot of Panda-hit sites, and one of the most common issues was e-commerce sites consisting primarily of stock product descriptions. Why would Google want to rank a site highly that just contains information that hundreds of other sites have?
If a large chunk of your site contains duplicate descriptions like this, you can attract a Panda flag, which can cause your whole site, not just the product pages, to rank poorly.
You could use the duplicate product descriptions if you had a large amount of original and helpful text around it. However, no one knows what the ratio is. If you have the ability to rewrite the product descriptions this is by far the best thing to do.
-
Just adding a point to this (and with reference to the other good points left by others) - Writing good product descriptions isn't actually that expensive!
It always seems that way, as descriptions are usually written in big batches. On a per-product basis, however, they are pretty cheap. Do it well and you will not only improve your search results, but also improve conversions and even make the page more linkable.
Pick a product at random. Would it be worth a few £/$ to sell more of that item? If not, remove it from the site anyway.
-
Adding a lot of SKUs to your site in a relatively short amount of time by borrowing content from another site sounds more like a bad sales pitch than a good "opportunity". If you don't want to jeopardize a significant chunk of your business, then simply drip the new SKUs in as you get new content for them. The thin content isn't likely to win you any new search traffic, so unless adding it will quickly and dramatically increase sales from your existing traffic sources, why go down that road?
-
Adding emphasis to the danger:
Duplicate product descriptions are the single most problematic issue e-commerce sites face from an SEO perspective. Not only are most canned descriptions so short that they cause product pages to be considered thin on content, but copied/borrowed descriptions are also likely to be spread across countless sites.
While it may seem like an inordinate amount of time and cost, unique, quality descriptions that are long enough to truly distinguish product pages will go a long way toward proving a site deserves ranking and trust.
-
You can hit Panda problems doing this. If you have lots of this content the rankings of your entire site could be damaged.
Best to write your own content, or use this content on pages that are not indexed until you have replaced it with original content.
Or you could publish it to get in the index and replace as quickly as possible.
The site you are getting this content from could be damaged as well.
-
You definitely could run into trouble here. Duplicate content of this type is meant to be dealt with on a page-level basis. However, if Google thinks it is manipulative, then it can impact the domain as a whole. By "think" I really mean "if it matches certain patterns that manipulative sites use" - there is rarely an actual human review.
It is more complex than a simple percentage; many factors are likely involved. However, there is a solution!
You can simply add a noindex tag to the product pages that have non-original content. That'll keep them out of the index and keep you on the safe side of duplicate content issues.
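For anyone unsure what that looks like in practice, here's a minimal sketch of a product page's head with the robots noindex directive (the page title and product name are hypothetical; the "follow" value is optional and simply tells crawlers to still follow links on the page):

```html
<head>
  <title>Example Product - Example Store</title>
  <!-- Keep this page out of the search index until it has an original description -->
  <meta name="robots" content="noindex, follow">
</head>
```

Once a page gets an original, unique description, remove the tag so the page can be indexed again.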