Could we run into issues with duplicate content penalties if we were to borrow product descriptions?
-
Hello,
I work for an online retailer that has the opportunity to add a lot of SKUs to our site in a relatively short amount of time by borrowing content from another site (with their permission). There are a lot of positives for us to do this, but one big question we have is what the borrowed content will do to our search rankings (we normally write our own original content in house for a couple thousand SKUs). Organic search traffic brings in a significant chunk of our business and we definitely don't want to do something that would jeopardize our rankings.
Could we run into issues with duplicate content penalties if we were to use the borrowed product descriptions?
Is there a rule of thumb for what proportion of the site should be original content vs. duplicate content without running into issues with our search rankings?
Thank you for your help!
-
I think Alan and EGOL have summed it up nicely for you.
I have looked at a lot of Panda-hit sites, and one of the most common issues was e-commerce sites consisting primarily of stock product descriptions. Why would Google want to rank a site highly that just contains information hundreds of other sites already have?
If a large chunk of your site contains duplicate descriptions like this, you can attract a Panda flag, which can cause your whole site to rank poorly, not just the product pages.
You could use the duplicate product descriptions if you surrounded them with a large amount of original, helpful text. However, no one knows what the ratio is. If you have the ability to rewrite the product descriptions, that is by far the best thing to do.
-
Just adding a point to this (and with reference to the other good points left by others): writing good product descriptions isn't actually that expensive!
It always seems that way, as they are usually done in big batches. However, on a per-product basis they are pretty cheap. Do it well and you will not only improve your search results, but also improve conversions and even make the pages more linkable.
Pick a product at random. Would it be worth a few £/$ to sell more of that item? If not, remove it from the site anyway.
-
Adding a lot of SKUs to your site in a relatively short amount of time by borrowing content from another site sounds more like a bad sales pitch than a good "opportunity". If you don't want to put a significant chunk of your business in jeopardy, then simply drip the new SKUs in as you get new content for them. The thin content isn't likely to win you any new search traffic, so unless adding it will quickly and dramatically increase sales from your existing traffic sources, why go down that road?
-
Adding emphasis to the danger:
Duplicate product descriptions are the single most problematic SEO issue e-commerce sites face. Not only are most canned descriptions so short that product pages get considered thin on content, but copied/borrowed descriptions are also likely to be spread across countless sites.
While it may seem like an inordinate amount of time and cost, unique, quality descriptions that are long enough to truly identify product pages as worthy will go a long way toward proving a site deserves ranking and trust.
-
You can hit Panda problems doing this. If you have lots of this content the rankings of your entire site could be damaged.
Best to write your own content, or use this content on pages that are not indexed until you have replaced it with original content.
Or you could publish it to get in the index and replace as quickly as possible.
The site you are getting this content from could be damaged as well.
-
You definitely could run into trouble here. Duplicate content of this type is meant to be dealt with on a page-level basis. However, if Google thinks it is manipulative, then it can impact the domain as a whole. By "think" I really mean "if it matches certain patterns that manipulative sites use" - there is rarely an actual human review.
It is more complex than a simple percentage; many factors are likely involved. However... there is a solution!
You can simply add a noindex tag to the product pages that have non-original content. That'll keep them out of the index and keep you on the safe side of duplicate content issues.
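For anyone implementing this, the noindex suggested above is a standard robots meta tag placed in a page's <head>. Here is a minimal sketch (a hypothetical template helper, not any particular platform's API) of emitting it only for product pages whose descriptions are borrowed:

```python
# Sketch: emit a robots noindex meta tag only for product pages with
# borrowed (non-original) descriptions, keeping them out of the index
# until original copy replaces them. The function name is illustrative.

def robots_meta_tag(description_is_original: bool) -> str:
    """Return the robots meta tag to place in a product page's <head>."""
    if description_is_original:
        # Original content: allow indexing as normal.
        return '<meta name="robots" content="index, follow">'
    # Borrowed content: keep the page out of the index, but still let
    # crawlers follow its links to the rest of the site.
    return '<meta name="robots" content="noindex, follow">'

# Borrowed description -> noindexed; original description -> indexed.
print(robots_meta_tag(False))
print(robots_meta_tag(True))
```

Once a description is rewritten in-house, flipping the flag back restores normal indexing for that page.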