Duplicate Content on Product Pages
-
I'm getting a lot of duplicate content errors on my ecommerce site www.outdoormegastore.co.uk mainly centered around product pages.
The products are completely different in terms of title, meta data, product descriptions and images (with alt tags), but SEOmoz is still identifying them as duplicates, and we've noticed a significant drop in Google rankings lately.
Admittedly the product descriptions are a little thin, but the content is definitely unique, so I don't understand why the pages would be treated as duplicates and ranked lower.
As an example these three pages have been identified as being duplicates of each other.
http://www.outdoormegastore.co.uk/regatta-landtrek-25l-rucksack.html
http://www.outdoormegastore.co.uk/canyon-bryce-adult-cycling-helmet-9045.html
http://www.outdoormegastore.co.uk/outwell-minnesota-6-carpet-for-green-07-08-tent.html
-
No, I don't have an exact match domain. But I have added duplicate content to a lot of product pages over the last two weeks. I believe Google has now crawled all of the product pages that contain duplicate content, and that is what is causing the ranking issue.
-
It may be the duplicate content, but you say it's only been the last three days?
You don't have an exact match domain, do you?
-
Hi!
Actually you are doing well, as creating original product pages is the best way to avoid duplicate content issues.
I want to discuss that sentence further. I am trying to create a user-friendly experience on my eCommerce website, and to do that I added specific product descriptions from the manufacturer's website to mine.
But my rankings have been falling for the last three days. My major keywords (office chairs, patio umbrellas and table lamps) have dropped around 10 positions. I drilled down into it and found this duplicate content issue.
I added the manufacturer's product descriptions to the products in these categories, and I am certain the ranking drop is due to this.
-
Hi!
Actually you are doing well, as creating original product pages is the best way to avoid duplicate content issues.
If I were you, I would eventually look at implementing a UGC review option, so that product pages become even more unique as time passes.
About Overstock... honestly, I can't give you an answer as to why it doesn't seem to suffer any consequences for its duplication. That would need a deeper investigation, which I don't have time for right now.
-
This is a very good discussion on duplicate content. My website's rankings have been going down for the last three days.
I drilled down after the latest SEO update and found duplicate content on my product pages.
My website contains true duplicate content on product pages, but I have found the same duplicate content on a competitor's website (Overstock).
They are not having any ranking issues, yet certain category-level pages on my site are. I have added true duplicates from the manufacturer's website to more than 2,000 of my pages.
I am going to recover by removing the duplicate content or adding unique content to the product pages. Do any SEOmoz users have additional input?
Competitor website:
My website:
Manufacturer website:
-
Hi Gavin,
Just to clarify, SEOmoz flags your content as duplicate if it finds 95% HTML similarity. You can use an online tool to compare pages yourself. I like this one:
http://www.webconfs.com/similar-page-checker.php
Google obviously uses a more sophisticated method than Moz, but it's still a good warning because pages without much unique content - even if they aren't true duplicates - often have a difficult time ranking for their targeted keywords.
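To make the comparison concrete, here is a rough local approximation of such a similarity check. This is a sketch only: the exact comparison method Moz and the webconfs tool use isn't public, the 95% figure is the threshold quoted above, and the product pages are invented stand-ins.

```python
# Approximate an HTML similarity check between two pages.
# SequenceMatcher on raw HTML is an assumption for illustration,
# not Moz's actual algorithm.
import difflib

def page_similarity(html_a: str, html_b: str) -> float:
    """Return a 0-1 similarity ratio between two HTML documents.

    autojunk=False stops Python from discarding frequent characters,
    which would distort the ratio on documents this size.
    """
    return difflib.SequenceMatcher(None, html_a, html_b, autojunk=False).ratio()

# Two hypothetical product pages sharing a heavy template with thin,
# unique descriptions - exactly the situation in the question.
template = (
    "<html><head><title>{title}</title></head><body>"
    "<nav>Home | Camping | Cycling | Tents | Rucksacks | Helmets</nav>"
    "<h1>{title}</h1><p>{desc}</p>"
    "<footer>Delivery | Returns | Contact | About us</footer>"
    "</body></html>"
)
page_a = template.format(title="Regatta Landtrek 25L Rucksack",
                         desc="A light 25 litre rucksack.")
page_b = template.format(title="Canyon Bryce Cycling Helmet",
                         desc="An adjustable adult helmet.")

ratio = page_similarity(page_a, page_b)
print(f"{ratio:.0%} similar")  # high, because the shared template dominates
```

Even with completely different titles and descriptions, the ratio comes out high because the boilerplate (navigation, footer, markup) dwarfs the unique text - which is exactly why thin product descriptions trigger duplicate warnings.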
-
Good catch!
-
Just body.
You need a product template; this will make it easier. If you visit any major eCommerce website, you will see that every product has the same layout.
So something like...
Title of product > short description > spec > FAQs > etc.
This is just an answer to a question on a forum and it is already 100 or so words. You could have an FAQ section on each product and just make the questions up and answer them.
http://uk.answers.yahoo.com/question/index?qid=20090904093654AA2XDud
There are always ways of creating content; you just need to have a good think and something will come up.
-
Thanks for the reply.
Would the 300 unique words be spread across just the body content, or would that include meta content too?
It's going to be difficult describing tent pegs in 300 words or more!
-
Hello
The reason these are coming up as duplicate content is the thin content. You need at least 300 unique words on each page to make good content, and as you don't have that many words on these pages, they are being classed as duplicates.
If you add more to each description then this should change, and hopefully your rankings will too.
Good luck
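For anyone wanting to sanity-check that rule of thumb against their own pages, a quick sketch in Python. Note the 300-word figure is this answer's heuristic, not a published Google threshold, and the tent-peg description is invented.

```python
# Count how many distinct words a product description contains,
# to compare against the "300 unique words" rule of thumb above.
import re

def unique_word_count(body_text: str) -> int:
    """Count distinct words (case-insensitive) in the page body text."""
    words = re.findall(r"[a-z0-9']+", body_text.lower())
    return len(set(words))

# A hypothetical thin description, like the tent pegs mentioned below:
description = (
    "Lightweight aluminium tent pegs, pack of ten. "
    "Each peg is 18cm long and suits most family tents."
)
print(unique_word_count(description), "unique words - far short of 300")
```

Run against real product pages (body text only, per the answer above), this makes it easy to spot which descriptions are thin enough to risk being flagged.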
-
The thin content you do have is repeated across multiple pages on your domain.
-
Related Questions
-
Web accessibility - High Contrast web pages, duplicate content and SEO
Hi all, I'm working with a client who has various URL variations to display their content in High Contrast and Low Contrast. It feels like quite an old way of doing things. The URLs look like this:
domain.com/bespoke-curtain-making/ - Default URL
domain.com/bespoke-curtain-making/?style=hc - High Contrast page
domain.com/bespoke-curtain-making/?style=lc - Low Contrast page
My questions are: Surely this content is duplicate content according to a search engine? Should the different versions have a meta noindex directive in the header? Is there a better way of serving these pages? Thanks.
Intermediate & Advanced SEO | Bee159
-
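One pattern that would fit the High Contrast question above (a sketch, not a verified fix for that site): keep the contrast variants crawlable but point them all at the default URL with rel=canonical, so the ?style= versions consolidate rather than compete.

```html
<!-- In the <head> of all three variants (default, ?style=hc, ?style=lc).
     domain.com/bespoke-curtain-making/ is the placeholder URL from the
     question. -->
<link rel="canonical" href="https://domain.com/bespoke-curtain-making/" />
```

This avoids noindexing pages that real users need, while telling search engines which version to rank.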
Duplicate H1 on single page for mobile and desktop
I have a responsive site and, whilst this works and is liked by Google, from a user perspective the pages could look better on mobile. I have a WordPress site using the Divi Builder with Elegant Themes, and I have developed a separate page header for mobile that uses a manipulated background image and a smaller H1 font size. When crawling the site, two H1s can be detected on the same page - they are exactly the same words and only one shows according to device. However, I need to know if this will cause me a problem with Google and SEO. As the mobile changes are not just font size but also adaptations to some visual elements, it is not something I can simply alter in the CSS. I would appreciate some input as to whether this is a problem or not.
Intermediate & Advanced SEO | Cells4Life
-
Help with duplicate pages
Hi there, I have a client whose site I am currently reviewing prior to an SEO campaign. They still work with the development team who built the site (not my company). I have discovered 311 instances of duplicate content within the crawl report. The duplicate content appears to be 1, 2, or 3 versions of the same pages but with differing URLs. Example:
http://www.sitename.com
http://sitename.com
http://sitename.com/index.php
Other pages follow a similar or identical pattern. I suppose my question is mainly what could be causing this, and how can I fix it? Or is it something that will have to be fixed by the website developers? Thanks in advance, Darren
Intermediate & Advanced SEO | SEODarren
-
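For the www / non-www / index.php duplicates described in the question above, the usual remedy is a single set of 301 redirects to one canonical URL. A sketch for Apache, assuming the site runs on Apache with mod_rewrite enabled (sitename.com is the question's placeholder domain; the developers would adapt this to the real stack):

```apache
RewriteEngine On
# Send the bare domain to the www version (301 = permanent)
RewriteCond %{HTTP_HOST} ^sitename\.com$ [NC]
RewriteRule ^(.*)$ http://www.sitename.com/$1 [R=301,L]
# Collapse /index.php onto the root URL
RewriteRule ^index\.php$ / [R=301,L]
```

With the redirects in place, crawlers and link equity consolidate onto one version of each page.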
Duplicate content question
Hi there, I work for a theater news site. We have an issue where our system creates a chunk of duplicate content in Google's eyes and we're not sure how best to solve it. When an editor produces a video, it simultaneously 1) creates a page with its own static URL (e.g. http://www.theatermania.com/video/mary-louise-parker-tommy-tune-laura-osnes-and-more_668.html); and 2) displays said video on a public index page (http://www.theatermania.com/videos/). Since the content is very similar, Google sees them as duplicates. What should we do about this? We were thinking that one solution would be to dynamically canonicalize the index page to the static page whenever a new video is posted, but would Google frown on this? Alternatively, should we simply nofollow the index page? Lastly, are there any solutions we may have missed entirely?
Intermediate & Advanced SEO | TheaterMania
-
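On the video question above, one option (a sketch; whether it fits depends on whether /videos/ should rank in its own right) is to keep the index page crawlable but out of the index, rather than canonicalising an index that aggregates many videos to a single video page:

```html
<!-- In the <head> of the index page, http://www.theatermania.com/videos/.
     "follow" keeps the links to the static video pages crawlable. -->
<meta name="robots" content="noindex, follow">
```

This preserves the static video URLs as the only indexable versions while still letting crawlers discover them through the index.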
Site been plagiarised - duplicate content
Hi, I look after two websites; one sells commercial mortgages, the other sells residential mortgages. We recently redesigned both sites, and one was moved to a new domain name as we rebranded it from being a trading style of the other brand to being a brand in its own right.
I have recently discovered that one of the most important pages on the residential mortgages site is not in Google's index. I did a bit of poking around with Copyscape and found another broker has copied our page almost word for word. I then used Copyscape to find all the other instances of plagiarism on the other broker's site, and there are a few! It now looks like they have copied pages from our commercial mortgages site as well.
I think the reason our page has been removed from the index is that we relaunched both these sites with new navigation and consequently new URLs. Can anyone back me up on this theory? I am 100% sure that our page is the original version, because we write everything in-house and I check it with Copyscape before it gets published. The fact that this other broker has copied from several different sites corroborates this view.
Our legal team has written two letters (not sent yet): one to the broker and the other to the broker's web designer. These letters ask the recipient to remove the copied content within 14 days. If they do remove our content from their site, how do I get Google to reindex our pages, given that Google thinks OUR pages are the copied ones and not the other way around? Does anyone have any experience with this? Or will it just happen automatically? I have no experience of this scenario!
In the past, where I've found duplicate content like this, I've just rewritten the page and chalked it up to experience, but I don't really want to in this case because, frankly, the copy on these pages is really good! And I don't think it's fair that someone else could potentially be getting customers who were persuaded by OUR copy.
Any advice would be greatly appreciated. Thanks, Amelia
Intermediate & Advanced SEO | CommT
-
Noindexing Duplicate (non-unique) Content
When "noindex" is added to a page, does this ensure Google does not count the page as part of its analysis of the unique vs duplicate content ratio on a website?
Example: I have a real estate business and I have noindex on MLS pages. However, is there a chance that even though Google does not index these pages, Google will still see them and think, "Ah, these are duplicate MLS pages; we are going to let them drag down the value of the entire site and lower the ranking of even the unique pages"?
I like to just use "noindex, follow" on those MLS pages, but would it be safer to add the pages to robots.txt as well? In theory that should increase the likelihood that Google will not see such MLS pages as duplicate content on my website.
On another note: I had these MLS pages indexed, and 3-4 weeks ago I added "noindex, follow". However, they are all still indexed and there are no signs Google is noindexing them yet.
Intermediate & Advanced SEO | khi5
-
Why are these pages considered duplicate content?
I have a duplicate content warning in our PRO account (well, several really) but I can't figure out WHY these pages are considered duplicate content. They have different H1 headers, different sidebar links, and while a couple are relatively scant as far as content (so I might believe those could be seen as duplicate), the others seem to have a substantial amount of content that is different. It is a little perplexing. Can anyone help me figure this out? Here are some of the pages that are showing as duplicate:
http://www.downpour.com/catalogsearch/advanced/byNarrator/narrator/Seth+Green/?bioid=5554
http://www.downpour.com/catalogsearch/advanced/byAuthor/author/Solomon+Northup/?bioid=11758
http://www.downpour.com/catalogsearch/advanced/byNarrator/?mediatype=audio+books&bioid=3665
http://www.downpour.com/catalogsearch/advanced/byAuthor/author/Marcus+Rediker/?bioid=10145
http://www.downpour.com/catalogsearch/advanced/byNarrator/narrator/Robin+Miles/?bioid=2075
Intermediate & Advanced SEO | DownPour
-
About robots.txt for resolve Duplicate content
I have trouble with duplicate content and titles. I have tried many ways to resolve them, but because of the web code I am still stuck, so I have decided to use robots.txt to block the duplicate content.
The first question: how do I use a command in robots.txt to block all URLs like these?
http://vietnamfoodtour.com/foodcourses/Cooking-School/
http://vietnamfoodtour.com/foodcourses/Cooking-Class/
.......
User-agent: *
Disallow: /foodcourses
(Is that right?)
And the parameter URLs:
http://vietnamfoodtour.com/?mod=vietnamfood&page=2
http://vietnamfoodtour.com/?mod=vietnamfood&page=3
http://vietnamfoodtour.com/?mod=vietnamfood&page=4
User-agent: *
Disallow: /?mod=vietnamfood
(Is that right? I have a folder containing the module - could I use disallow: /module/*?)
The second question is: which takes priority, robots.txt or the meta robots tag? What happens if I use robots.txt to block a URL, but that URL's meta robots tag says "index, follow"?
Intermediate & Advanced SEO | | magician