Duplicate eCommerce Product Descriptions
-
I know that creating original product descriptions is best practice. What I don't understand is how other sites are able to generate significant traffic while still using duplicate product descriptions on all product pages. How are they not being penalized by Google?
-
From my experience as an SEO for a large eCommerce site (our own products), I tend to think that Google has a way of recognizing eCommerce sites as distinct from purely informational ones and takes that into consideration when analyzing content.
As you say Chris, many producers will distribute their catalogs to all their dealers, who in turn will put those online. The same happens with our products here. Our dealers use the very descriptions we provide them with, and no one has ever been penalized for that.
As I said, I personally think that Google takes the intent of your site (eCommerce, informational, etc.) into consideration when handing out duplicate content penalties.
Having said that, I have no data to back up that claim, so go easy on me; it's only based on my gut feeling and practical observations.
-
I can definitely understand the frustration, but Google won't penalize sites simply for having duplicate content, especially not storefronts. Many merchants are provided with photos and product descriptions by the distributor, and when you're talking about hundreds or even thousands of products, it's just not feasible for a merchant to change all of the descriptions, even more so if your inventory is changing on a monthly or even weekly basis. Then all of your changes get overwritten with the new upload.
A good example would be the SMC websites that you see on late night TV where they send out a CD with products to thousands of customers and 98% of them just upload the database into their stores with little to no alteration. They won't be penalized, but they just won't be able to sell much.
In those cases, the sites aren't going to be penalized. And if those sites are ranking well without changing the content, then Google is definitely looking at other factors to make that decision (traffic, bounce rate, time on site, etc.).
The sites Google is penalizing are the ones that intentionally try to game the system by scraping content from other sites and reposting it with literally no changes at all, and sites that try to duplicate one of their stores multiple times in a cookie-cutter fashion to trick the system and see if they can get multiple listings in the SERPs.
You haven't provided specific sites to review for a definitive answer here, but they don't sound like they're trying to do anything black hat; they're just lazy. If your site will be selling the same products, altering your descriptions and images is the only way you'll get an advantage over them instead of just becoming "yet another one of those sites". Good luck!
-
Thanks for the Amazon comment Chris :). I understand the multitude of variables involved in this question, but after looking at a group of sites with similar backlink profiles, site architecture, etc. that all use duplicate product descriptions, I am taken aback that they are not penalized. Even smaller sites that are not properly constructed or optimized use duplicate product descriptions and still drive traffic and rank. Then I read all about rewriting product descriptions from SEOmoz and others (this information gels with what I know to be true), but then see sites still ranking with this thin/duplicate content.
Any thoughts?
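For what it's worth, one way to check how literal the duplication is across those sites would be something like this rough Python sketch, which normalizes each description and hashes it to group exact copies (the URLs and descriptions below are made-up placeholders; real data would come from a crawl or product feed):

```python
import hashlib
import re

def normalize(text: str) -> str:
    """Lowercase, strip punctuation, and collapse whitespace so trivially
    reformatted copies of the same description hash to the same value."""
    text = re.sub(r"[^\w\s]", " ", text.lower())
    return re.sub(r"\s+", " ", text).strip()

def find_duplicates(descriptions: dict[str, str]) -> dict[str, list[str]]:
    """Group page URLs by the hash of their normalized description."""
    groups: dict[str, list[str]] = {}
    for url, desc in descriptions.items():
        digest = hashlib.sha1(normalize(desc).encode("utf-8")).hexdigest()
        groups.setdefault(digest, []).append(url)
    # Keep only hashes shared by more than one URL, i.e. actual duplicates.
    return {h: urls for h, urls in groups.items() if len(urls) > 1}

# Made-up example data: two pages sharing the manufacturer's description.
pages = {
    "https://site-a.example/widget": "The Widget 3000 is durable and lightweight.",
    "https://site-b.example/widget": "The Widget 3000 is durable and lightweight!",
    "https://site-c.example/widget": "Our hand-tested Widget 3000 review...",
}
print(find_duplicates(pages))
```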
-
That could be for a variety of reasons. Is that site the only one offering that particular product? Is it a highly trafficked site with a lot of backlinks, reviews, and online activity? Are the pages simply coded properly, using canonical tags that help them escape "wrath"? These are all valid questions when you're doing competitive analysis, and all things that Google weighs along with dozens of other factors.
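If you want to check the canonical-tag question during competitive analysis, a minimal Python sketch along these lines (assuming the third-party requests and BeautifulSoup libraries; the product URL is just a placeholder) will show whether a page declares a canonical URL:

```python
import requests
from bs4 import BeautifulSoup

def get_canonical(url: str) -> str | None:
    """Return the canonical URL declared in a page's <head>, if any."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    link = soup.find("link", rel="canonical")
    return link.get("href") if link else None

# Placeholder URL -- swap in the competitor page you are analyzing.
print(get_canonical("https://www.example.com/product/widget-3000"))
```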
Your best practice is to create new descriptions, take new photos or alter the existing ones (add text, crop, change contrast, etc.). This way your listing is seen as fresh and original content and will eventually take precedence over their carbon copy approach. If you have a better page with better content that's more informative to the customer, Google will choose your listing over 20 other sites that all have the same photos and descriptions.
Originality always wins... in most cases. Keep in mind that there are many other considerations in the Google algorithms, so don't expect to beat out Amazon no matter how hard you try.
Related Questions
-
Will it upset Google if I aggregate product page reviews up into a product category page?
We have reviews on our product pages and we are considering averaging those reviews out and putting them on specific category pages in order for the average product ratings to be displayed in search results. Each averaged category review would only cover the products within its category, and all reviews are from users of the site, no third-party reviews. For example, averaging the reviews from all of our box product pages and listing that average review on the Boxes category page. My question is, will this be doing anything wrong in the eyes of Google, and if so, how? -Derick
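This isn't from the original thread, but as a rough illustration of the averaging step itself, here is a minimal Python sketch that rolls product-level ratings up into a category average weighted by review count (the categories, ratings, and counts are made-up examples):

```python
from collections import defaultdict

# Made-up product reviews: (category, average rating, number of reviews).
products = [
    ("boxes", 4.5, 12),
    ("boxes", 3.8, 5),
    ("boxes", 5.0, 2),
    ("tape",  4.1, 30),
]

totals = defaultdict(lambda: [0.0, 0])  # category -> [rating sum, review count]
for category, avg_rating, review_count in products:
    totals[category][0] += avg_rating * review_count
    totals[category][1] += review_count

for category, (rating_sum, count) in totals.items():
    print(f"{category}: {rating_sum / count:.2f} average from {count} reviews")
```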
On-Page Optimization | Deluxe0
-
How unique should a meta description be?
I'm working on a large website (circa 25k pages) that presently just replicates each page title as a meta description. I'm thinking of doing a 'find and replace' in the database, where the preceding and following text would be the same in each case. Is this unique enough? Obviously the individual keyword would make it technically unique each time... and manually changing them would take the rest of my life 🙂
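Purely as an illustration of the templated approach (the surrounding wording below is an invented placeholder, not from the original post), the find-and-replace could look something like this in Python:

```python
def build_meta_description(page_title: str, max_length: int = 155) -> str:
    """Wrap the page title in fixed preceding/following text and trim it to a
    sensible meta-description length."""
    # Placeholder boilerplate -- the real wording would be site-specific.
    description = f"Buy {page_title} online. Free delivery and easy returns on {page_title}."
    return description[:max_length].rstrip()

# Example titles standing in for the ~25k database rows.
for title in ["Red Widget 3000", "Blue Widget Mini"]:
    print(build_meta_description(title))
```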
On-Page Optimization | abisti21
-
Product content length & links within product description
Hello, I have questions regarding content length and links within descriptions. With our ecommerce site, we have thousands of products, each with a unique description. In the product description, I have links to the parent category and grandparent category (if it has one) in the main product text, which is generally about 175 words. Then I have a last paragraph of about 75 words that includes links to our main homepage and our main product catalogue page. Is the content length long enough? I used to use text that was 500 words, and after shortening it I still rank when launching new products, so I don't think an increase in text length will have any additional benefit. I do see conflicting information when I search, with some people recommending a minimum of 300 words and some saying to aim for 1,000 for category pages. In regards to the links, I noticed a competitor has stopped following this format, so I'm unsure if I should keep going too. Is it too many links to have each of the products link back to the main catalogue and homepage? Is it good to have links with anchor text to the categories a product is in? There are breadcrumbs on the page with these links already. There are already heaps of links on our pages (footer, and a right sidebar with image links to relevant categories), so my pages do get flagged for too many links. Thanks!
On-Page Optimization | JustinBSLW0
-
Different meta-description per country?
I have this .com domain, which is the corporate website. Next to this domain, we also have local domains. We would like to test a different meta description per country on this one corporate .com domain. Does anyone know if this is possible and how we could integrate this?
On-Page Optimization | WeAreDigital_BE0
-
Duplicate content penalty
When Moz crawls my site, they say I have 2x the pages that I really have, and they say I am being penalized for duplicate content. I know years ago I had my old domain resolve over to my new domain. It's the only thing that makes sense as the source of the duplicate content, but would search engines really penalize me for that? It is technically only on one site. My business took a significant sales hit starting early July 2013; I know Google did an algorithm update then that had SEO aspects. I need to resolve the problem so I can stay in business.
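One thing worth verifying in a case like this is whether the old domain actually 301-redirects to the new one rather than serving a second copy of the site. A quick Python sketch (assuming the requests library; the domain is a placeholder) prints the redirect chain:

```python
import requests

def check_redirect(old_url: str) -> None:
    """Print the redirect chain from an old URL so you can confirm it is a
    single 301 to the new domain rather than a duplicate copy of the site."""
    response = requests.get(old_url, allow_redirects=True, timeout=10)
    for hop in response.history:
        print(f"{hop.status_code}  {hop.url}")
    print(f"{response.status_code}  {response.url}  (final)")

# Placeholder domain -- replace with the old domain that was pointed at the new site.
check_redirect("http://old-domain.example/")
```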
On-Page Optimization | cheaptubes0
-
Do I need a Meta description for every page?
Hi guys, we have just developed a new website and I'm looking to add meta descriptions with relevant keywords to the pages. As the site has over 80 pages it is quite an undertaking, and I was wondering if pages such as the shopping cart, FAQs, etc. need meta descriptions as well? Thanks in advance : ) Pete
On-Page Optimization | dawsonski0
-
Avoiding "Duplicate Page Title" and "Duplicate Page Content" - Best Practices?
We have a website with a searchable database of recipes. You can search the database using an online form with dropdown options for:
Course (starter, main, salad, etc.)
Cooking Method (fry, bake, boil, steam, etc.)
Preparation Time (Under 30 min, 30 min to 1 hour, Over 1 hour)
Here are some examples of how URLs may look when searching for a recipe:
find-a-recipe.php?course=starter
find-a-recipe.php?course=main&preperation-time=30min+to+1+hour
find-a-recipe.php?cooking-method=fry&preperation-time=over+1+hour
There is also pagination of search results, so the URL could also have the variable "start", e.g. find-a-recipe.php?course=salad&start=30. There can be any combination of these variables, meaning there are hundreds of possible search results URL variations. This all works well on the site, however it gives multiple "Duplicate Page Title" and "Duplicate Page Content" errors when crawled by SEOmoz. I've searched online and found several possible solutions for this, such as:
Setting a canonical tag
Adding these URL variables to Google Webmaster Tools to tell Google to ignore them
Changing the title tag in the head dynamically based on which URL variables are present
However, I am not sure which of these would be best. As far as I can tell, the canonical tag should be used when you have the same page available at two separate URLs, but this isn't the case here as the search results are always different. Adding these URL variables to Google Webmaster Tools won't fix the problem in other search engines, and we will presumably continue to get these errors in our SEOmoz crawl reports. Changing the title tag each time can lead to very long title tags, and it doesn't address the problem of duplicate page content. I had hoped there would be a standard solution for problems like this, as I imagine others will have come across this before, but I cannot find the ideal solution. Any help would be much appreciated. Kind regards
On-Page Optimization | smaavie
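Not an answer from the thread, but as a sketch of how the canonical-tag option is sometimes implemented for faceted search, here is some Python that maps any combination of the search parameters above onto one canonical URL by dropping pagination and sorting the remaining filters (the example.com domain is a placeholder):

```python
from urllib.parse import urlencode, urlparse, parse_qsl, urlunparse

# Parameters that only change presentation, not the result set.
IGNORED_PARAMS = {"start"}

def canonical_search_url(url: str) -> str:
    """Drop pagination parameters and sort the rest so every variation of the
    same search maps to one URL suitable for a <link rel="canonical"> tag."""
    parts = urlparse(url)
    params = [(k, v) for k, v in parse_qsl(parts.query) if k not in IGNORED_PARAMS]
    query = urlencode(sorted(params))
    return urlunparse(parts._replace(query=query))

print(canonical_search_url("https://example.com/find-a-recipe.php?course=salad&start=30"))
print(canonical_search_url(
    "https://example.com/find-a-recipe.php?preperation-time=30min+to+1+hour&course=main"))
```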