What is the lesser of two evils? Duplicate product descriptions or thin content?
-
It is quite labour-intensive to come up with product descriptions for our entire product range ... 2,500+ products, in English and Spanish...
When we started, we copy-pasted manufacturer descriptions, so they are not unique (on the web), and some of them repeat each other -
We are getting unique content written, but it's going to be a long process. So, which is the lesser of two evils: lots of duplicate, non-unique content, or removing it and showing only a very short unique phrase from the database (thin content)?
Thanks!
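As an aside, a quick way to triage which of the 2,500+ descriptions actually overlap is to fingerprint the normalized text and group products by fingerprint. A minimal sketch (the SKUs and descriptions are made up; in practice you'd pull them from your product database):

```python
import hashlib
import re

def fingerprint(text: str) -> str:
    """Normalize whitespace and case so trivial edits don't mask duplicates."""
    normalized = re.sub(r"\s+", " ", text.strip().lower())
    return hashlib.sha1(normalized.encode("utf-8")).hexdigest()

def find_duplicates(descriptions: dict) -> dict:
    """Group product IDs by description fingerprint; keep only groups of 2+."""
    groups = {}
    for product_id, text in descriptions.items():
        groups.setdefault(fingerprint(text), []).append(product_id)
    return {fp: ids for fp, ids in groups.items() if len(ids) > 1}

catalog = {
    "sku-001": "Premium stainless steel coffee maker.",
    "sku-002": "Premium  stainless steel coffee maker. ",  # same text, extra spaces
    "sku-003": "A hand-ground ceramic burr grinder.",
}
print(find_duplicates(catalog))  # sku-001 and sku-002 grouped together
```

The duplicate groups give you a priority list: rewrite one description per group first, and you cut the duplication fastest per writing hour.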
-
Very good answer - and yes, two bad choices, but limited resources mean I must choose one. Either that or meta NOINDEX the dupes for the moment until they are rewritten.
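For reference, the NOINDEX route is a single tag in the head of each duplicate page (a sketch; most CMS and ecommerce platforms expose this as a per-page setting):

```html
<!-- Keep this page out of the index while its description is rewritten,
     but still let crawlers follow its links to other products -->
<meta name="robots" content="noindex, follow">
```

Remove the tag once the page has its unique description so it can be re-indexed.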
-
Good idea. Thank you.
-
I agree with you, Kurt. In our space we see duplicate content everywhere, from manufacturers' sites to vendors to resellers. There is no such thing as a "duplicate content penalty." Google doesn't penalize duplicate content. They may choose to ignore it, which may feel like a penalty, but that's not technically what's going on.
I also agree with EGOL. If writing that many product descriptions is a daunting task, hire some writers. You can get it done for way less than you think. Need inspiration? Watch Fabio's video from MozCon 2012, where in 15 minutes he describes how he and his team created thousands of unique product descriptions in a very short amount of time without spending a lot of money: http://moz.com/videos/e-commerse-seo-tips-and-tricks
Cheers!
Dana
-
I'd take duplicate content over thin content. There are tons of eCommerce sites out there with duplicate product descriptions. I don't think Google is going to penalize you, per se; they just might not include your pages in the search results in favor of whatever site they think originated the content.
The reason I think duplicate content is better is the users. Either way, your search traffic probably isn't going to be great: with duplicates, the search engines may ignore your pages, and with thin content you haven't given them a reason to rank you. But at least with some real content on the pages, you may be able to convert the visitors you do get.
That said, I like EGOL's suggestion. Don't write the new product descriptions yourself. Hire a bunch of people to do it so they can crank out the new content quickly.
Kurt Steinbrueck
OurChurch.Com -
Tom... that is some of the best that I have seen in a long time.
Thanks!
-
Nothing like a bit of hyperbole to brighten up a Tuesday, is there?!
-
I'd rather deal with the duplicate content. Personally, I'd bounce more quickly from thin or no content than from the same content on a different but similar product page. Of course, I wouldn't let the duplicate content sit there and hurt me... I'd add canonicals to pages that were similar. Now, if it was the exact same content everywhere, that would drive me nuts. But if I could look at all the products and work out how many are the same with a minor variation versus how many are truly different product types, then I could write content for fewer pages and consolidate link equity with the canonical, without worrying about duplicate content penalizing me. Of course, I could always just NOINDEX those duplicate pages instead.
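The canonical approach described above is also a single tag in the head of each near-duplicate variant page (a sketch with hypothetical URLs; point it at the one version you want to rank):

```html
<!-- Tell search engines the preferred version of this product page -->
<link rel="canonical" href="https://www.example.com/products/coffee-maker">
```

Unlike NOINDEX, this consolidates the variants' link equity onto the canonical page rather than dropping the pages from consideration entirely.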
-
With a gun to my head...
lol... Wow. That is a great way to word this.
So my response is: yes, put a gun to my head and I will pick between these two bad choices.
Really, if you are paying someone to write all of this content, you could hire one writer and have them take a year to do it... or you could hire 12 writers and have the job done in a month. Same cost either way.
-
With a gun to my head - I'd say thin content is "better" than mass duplicate content.
This is based only on my experience helping to remove penalties from clients' sites - I see more instances of a Panda penalty when duplicate content is present than with 'thin' content, as it were.
However, it's important to understand how the algorithm works. It penalises pages based on content similarity - so if a page has thin content on it (i.e., not a lot to differentiate it from another page on the domain), technically Google will see it as a duplicate page with thin content on it.
Now, my line of thinking is that if there is more content on the page but the majority of it is duplicate - i.e., physically more duplicate content on the page - then Google would see this as "worse". Similarly, taking product descriptions from one domain to another, and having duplicate content from other domains, seems to be penalised more frequently by the Panda algorithm than thin-content pages alone (at least in my experience).
Your mileage may vary on this, but if forced into a temporary solution, thin content may be better for SEO - though conversely worse for users, as there is less about the product on the page. The best solution, of course, is to rewrite the descriptions, but obviously there's a need for a temporary fix in the meantime.
Hope this helps.