Duplicate Content... Really?
-
Hi all,
My site is www.actronics.eu
Moz reports virtually every product page as duplicate content, flagged as HIGH PRIORITY!
I know why.
Moz classes a page as duplicate if more than 95% of its content/code is similar.
There's very little I can do about this: although our products are different, the content is very similar, differing only in a few part numbers and the vehicle make/model.
Here's an example:
http://www.actronics.eu/en/shop/audi-a4-8d-b5-1994-2000-abs-ecu-en/bosch-5-3
http://www.actronics.eu/en/shop/bmw-3-series-e36-1990-1998-abs-ecu-en/ate-34-51
Now, multiply this by ~2,000 products x 7 different languages and you'll see we have a big duplicate content issue (according to Moz's Crawl Diagnostics report).
I say "according to Moz..." because I don't know whether this is actually an issue for Google. 90% of our product pages rank, albeit some much better than others.
So what is the solution? We're not trying to deceive Google in any way, so it would seem unfair to be hit with a duplicate content penalty. This is a legitimate dilemma where our products differ by as little as a part number.
One ugly solution would be to remove the header/sidebar/footer on our product pages, as I've demonstrated here - http://woodberry.me.uk/test-page2-minimal-v2.html - since this removes a lot of page bloat (code) and would bring the page difference down to 80% duplicate.
(This is the tool I'm using for checking: http://www.webconfs.com/similar-page-checker.php)
Other "prettier" solutions would be greatly appreciated. I look forward to hearing your thoughts.
Thanks,
Woody -
Hey David
Thanks for the reply.
3. Use a plugin to apply rich snippet markup to the individual product pages, adding another layer of "uniqueness"
I had already thought about this and was looking into the MPN (Manufacturer Part Number) property for products (https://schema.org/mpn); however, it's not clear whether, like SKU, the MPN needs to be unique to a ProductModel (https://schema.org/ProductModel).
If that were the case, I'd have a problem, as there are multiple MPNs per ProductModel.
I see https://schema.org/isVariantOf too, which could be useful?
Anyone with experience of Schema?
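To illustrate what I'm considering, here's a rough sketch only - the product name and URL are taken from our catalogue, the part number is just one of ours used as an example, and whether Google accepts multiple MPNs per ProductModel is exactly the open question:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ProductModel",
  "name": "ATE MK70 ABS ECU (Audi A3 variant)",
  "mpn": "10097003153",
  "isVariantOf": {
    "@type": "ProductModel",
    "name": "ATE MK70 ABS ECU",
    "url": "http://www.actronics.eu/en/shop/product/ate-mk70"
  }
}
</script>
```

The idea being that each variant page carries its own mpn while isVariantOf points back at the base product.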
-
First, why were you looking at the reports? Have you seen some type of ranking loss that you are trying to remedy?
Second, the Moz tools are just tools to provide you with an overview of where you're at and potential areas where your site can be improved. They work, but they're not dedicated to any one type of website, i.e. e-commerce vs. static or content-based.
To get the unique pages you seek, it may be possible to use JavaScript to load content for the part-number variables. As stated before, your site is getting seen as duplicate because only a few things change from page to page.
Possible fixes:
1. Use dynamic coding to load part-number variables, such as drop-down menus for alternate versions of parts or models. This will also leave you fewer pages to direct your backlinks to.
2. Have more top-level pages based around the category, and focus on getting the category pages ranking rather than the individual part pages. Again, focus your backlinking efforts on these pages.
3. Use a plugin to apply rich snippet markup to the individual product pages, adding another layer of "uniqueness"
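A minimal sketch of what suggestion #1 might look like under the hood: one page holds a vehicle-to-part-number lookup, and a drop-down's change handler reads from it instead of linking out to a separate URL per part number. The two part numbers are the examples quoted elsewhere in this thread; the function name is made up.

```javascript
// One product page, many part numbers. The table maps each vehicle
// to the manufacturer's part number for the same underlying product.
// Vehicle names and part numbers are illustrative catalogue examples.
const partNumbers = {
  "Audi A3": "10097003153",
  "Peugeot 206": "9659136980",
};

// Returns the manufacturer part number for the chosen vehicle, or null
// if the vehicle isn't covered by this product.
function partNumberFor(vehicle) {
  return partNumbers[vehicle] ?? null;
}

// In the page itself, a <select> element's change handler would call
// partNumberFor() and write the result into the part-number element.
console.log(partNumberFor("Audi A3"));
```

This keeps all the backlink equity on one URL while still surfacing every part number to the visitor.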
-
The pages were not intended strictly for SEO value; they were mainly built for user value, i.e. returning a page 100% focused on the part number the visitor searched for. Remember, many people use Google as a navigational tool, and they also consider the product to be the part number they searched for, not the main manufacturer of the product (ATE).
I understand what you are saying, though, and think building stronger product pages is the way to go, although I will try it on a subset of pages and monitor results.
Now to decide which approach to take to yield the best results:
a.) SEO focus on ATE MK70 (list all the vehicle makes/models/years this product work on, including list of part numbers)
or...
b.) SEO focus on vehicle make/model (then list all the manufacturers of suitable products, with corresponding part numbers)
Thanks,
Woody -
This is one of the things Panda was trying to discourage (creating pages strictly for SEO value as opposed to user value that have thin content).
Consolidating and building out a single page is the way to go. Google will still crawl the product numbers, and they will be on a much stronger page. Even if they're not in the URL and title, a more valuable page nearly always wins out.
Not only that, you're playing with fire right now. If you haven't been hit by Panda yet, your odds are much higher with the numerous little pages.
-
Thanks guys
William
What's the thought process of creating a bunch of new pages, even though it's the same product, just referred to differently by different companies? Just for the unique URLs and titles?
Samuel
Would you want to create a separate page for "red Honda Civic," "green Honda civic," and countless other colors? Of course not.
To hopefully address both questions with one answer: the reason for building separate pages was to give SEO focus to the unique part numbers and to the product type by vehicle make/model/year.
Very few people in the industry search for the product by name; it's always by part number. In fact, I'd go as far as to say there are few who would actually know the brand of "the product", that being the ATE MK70 in our example above.
I understand the logic of building a strong single product page with all these part numbers listed, but would this page really rank well for searches on a part number? Bear in mind, unlike the red, green, blue Honda Civic example, where there are perhaps a dozen different colours, we're talking literally hundreds of part numbers per product, plus variations of their formatting.
I welcome further conversation and ideas on this
Thanks so far guys! -
Thanks for the question. I'm not able to go through your site at the moment, but I would ask: Do you really need a separate page for every single make, model, and part number? Correct me if I'm wrong, but this seems to be what you're doing. If so, you're just asking for a Panda penalty.
Here's a basic example: say that you sell Honda Civics. Would you want to create a separate page for "red Honda Civic," "green Honda Civic," and countless other colors? Of course not. All of the content would be entirely the same except for the listed color in each title and page's text.
I'd take a look at Amazon as an example. Say that I go to a page for a certain T-shirt. The page for that individual product will include all of the color variations within that single product page. Each color variation is not a new page and URL (or if it is, it has a rel=canonical tag back to the main product page -- I don't remember). I'd look to this example as a way that you can vastly cut down the number of product pages so that each one is truly unique, valuable, and useful to both search engines and customers.
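In markup terms, that canonical approach is just one line in the head of each variation page - a sketch with made-up URLs, not Amazon's actual markup:

```html
<!-- On a variation page such as /t-shirt-red, point search engines
     at the main product page so only one copy gets indexed.
     URLs are illustrative placeholders. -->
<link rel="canonical" href="http://www.example.com/shop/t-shirt" />
```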
I hope that helps -- good luck!
-
I think you're already in Panda territory. The content can't get much thinner. It seems like all those sub-pages that are linked to on the page you just shared are unnecessary, no? Couldn't you just have the one page, build it out with the cars it works in, maybe a diagram or instruction on how to put it in, and make a really valuable page?
What's the thought process of creating a bunch of new pages, even though it's the same product, just referred to differently by different companies? Just for the unique URLs and titles?
Consolidating all of that would eliminate thin content and likely strengthen your landing page exponentially.
-
Thank you for your answer, William, and for taking the time to respond.
I understand what you are saying, but I am a little skeptical about that being a logical/achievable solution.
Let's say we did write some content for each product; the content would be "thin" to say the least.
As an example, we have over 700 products (per language), this being one of them - http://www.actronics.eu/en/shop/product/ate-mk70
This product alone works in over 43 different vehicle marques, illustrated in the list on the page.
The only thing different about them is the part number, i.e. what the manufacturer refers to this part as (Audi A3 refers to it as 10097003153, Peugeot 206 refers to it as 9659136980). There really is nothing more to say about the product without creating more duplicate content and getting into Panda territory, so I don't see this being a viable solution.
We have the pages in place because mechanics/garages search by manufacturer's number, not product type.
Any more thoughts/ideas?
-
This issue isn't really duplicate content; Moz is just flagging it as such because of the severe lack of content, which makes the footer, sidebar, etc. the majority of the content on the page. This is not good, and the best way to remedy it would be to build out more content.
I realize with roughly 14k pages, this isn't realistic to do for every single page, but you could prioritize. What are your most popular products? Start with those and build out content to make sure they rank and perform as well as possible, and then continue to go down the list as you have time to do so, manually optimizing and building out the most profitable/popular pages first.
When it comes to unique content, there is no automated solution. Either you write stuff, hire someone else to write stuff, or do what a lot of places do: implement a review system for customers to use and crowd-source the unique content that way.
Related Questions
-
Best method for blocking a subdomain with duplicated content
Hello Moz Community, hoping somebody can assist. We have a subdomain, used by our CMS, which is being indexed by Google:
http://www.naturalworldsafaris.com/
https://admin.naturalworldsafaris.com/
The page is the same, so we can't add a noindex or nofollow. I have both set up as separate properties in Webmaster Tools. I understand the best method would be to update the robots.txt with a user disallow for the subdomain - but the robots.txt is only accessible on the main domain: http://www.naturalworldsafaris.com/robots.txt Will this work if we add the subdomain exclusion to this file? It means it won't be accessible at https://admin.naturalworldsafaris.com/robots.txt (where we can't create a file) and therefore won't be seen within that specific Webmaster Tools property. I've also asked the developer to add password protection to the subdomain, but this does not look possible. What approach would you recommend?
Intermediate & Advanced SEO | KateWaite
-
Ticket Industry E-commerce Duplicate Content Question
Hey everyone, how goes it? I've got a bunch of duplicate content issues flagged in my Moz report and I can't figure out why. We're a ticketing site, and the pages that are causing the duplicate content are for events that we no longer offer tickets to, but that we will eventually offer tickets to again. Check these examples out: http://www.charged.fm/mlb-all-star-game-tickets http://www.charged.fm/fiba-world-championship-tickets I realize the content is thin and that these pages are basically the same, but I understood that since the title tags are different, they shouldn't appear to the Goog as duplicate content. Could anyone offer me some insight or solutions? Should they be noindexed while the events aren't active? Thanks
Intermediate & Advanced SEO | keL.A.xT.o
-
How do I geo-target continents & avoid duplicate content?
Hi everyone, We have a website which will have content tailored for a few locations:
USA: www.site.com
Europe EN: www.site.com/eu
Canada FR: www.site.com/fr-ca
Link hreflang and the GWT option are designed for countries. I expect a fair amount of duplicate content; the only differences will be in product selection and prices. What are my options to tell Google that it should serve www.site.com/eu in Europe instead of www.site.com? We are not targeting a particular country on that continent. Thanks!
Intermediate & Advanced SEO | AxialDev
-
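A sketch of the hreflang annotations in question, using the placeholder URLs above - note that hreflang values can be language-only, without a country code, which is one way to express language-level rather than country-level targeting:

```html
<!-- Language-only "en" acts as the English version for regions not
     covered by a more specific value; URLs are the question's
     placeholders, not a real site. -->
<link rel="alternate" hreflang="en-us" href="http://www.site.com/" />
<link rel="alternate" hreflang="en" href="http://www.site.com/eu" />
<link rel="alternate" hreflang="fr-ca" href="http://www.site.com/fr-ca" />
<link rel="alternate" hreflang="x-default" href="http://www.site.com/" />
```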
Last Panda: removed a lot of duplicated content but still no luck!
Hello here, my website virtualsheetmusic.com has been hit several times by Panda since its inception back in February 2011, so 5 weeks ago we decided to get rid of about 60,000 thin, almost-duplicate pages via noindex metatags and canonicals (we have not physically removed those pages from our site to return a 404, because our users may search for those items on our own website), so we expected this last Panda update (#25) to give us some traffic back... instead we lost an additional 10-12% of traffic from Google, and now it looks even really badly targeted. Let me say how disappointing this is after so much work! I must admit that we still have many pages that may look like thin and duplicate content, and we are considering removing those too (but those are actually giving us sales from Google!), but I expected from this last Panda to recover a little bit and improve our positions in the index. Instead nothing; we have been hit again, and badly. I am pretty desperate, and I am afraid to have lost the compass here. I am particularly afraid that the removal of over 60,000 pages from the index via noindex metatags, for some unknown reason, has been more damaging than beneficial. What do you think? Is it just a matter of time? Am I on the right path? Do we need to wait just a little bit more and keep removing (via noindex metatags) duplicate content and improve all the rest as usual? Thank you in advance for any thoughts.
Intermediate & Advanced SEO | fablau
-
Issue with duplicate content in blog
I have a blog where all the pages get indexed, with rich content in them, but the blog's tag and category URLs also get indexed. I have just added my blog to Moz Pro and checked my Crawl Diagnostics Summary, and it's showing that some of my blog content is the same. For example: www.abcdef.com/watches/cool-watches-of-2012/ is already indexed, but I have assigned some tags and a category to this URL, which have also been indexed with the same content. So how shall I stop search engines from crawling these tag and category pages? If I have more nofollow tags in my blog, does that give a negative impression to search engines? Is there an alternate way to tell search engines to stop crawling these category and tag pages?
Intermediate & Advanced SEO | sumit60
-
Duplicate Title Tags & Duplicate Meta Descriptions after 301 Redirect
Today I was checking my Google Webmaster Tools and found 16,000 duplicate title tags and duplicate meta descriptions. I have investigated this issue and here is what I found: I changed the URL structure for 11,000 product pages on 3rd July 2012 and set up 301 redirects from the old product pages to the new ones. Google has started to crawl my new product pages, but de-indexing of the old URLs is quite slow. That's why I found this issue in Google Webmaster Tools. Can anyone suggest how I can increase the rate of de-indexing for the old URLs? Or any other suggestions? How much time will Google take to de-index the old URLs from web search?
Intermediate & Advanced SEO | CommercePundit
-
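For context, the kind of 301 described above is typically a one-line rule in Apache - a sketch with made-up paths standing in for the real old and new URL structures:

```apache
# Permanent (301) redirect from an old product URL to its new
# equivalent. Both paths are illustrative placeholders.
Redirect 301 /shop/old-product-url /shop/new-product-url
```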
Should I redirect all my subdomains to a single unique subdomain to eliminate duplicate content?
Hi there! I've been working on http://duproprio.com for a couple of years now. In the early stages of the website, we put into place a subdomain wildcard that allowed us to create URLs like this on the fly: http://{some-city}.duproprio.com This instantly brought us a lot of success in terms of traffic, due to the cities being great search keywords. But now business has grown, and as we all know, duplicate content is the devil, so I've been playing with the idea of killing (redirecting) all those URLs to their equivalents on the root domain: http://some-city.duproprio.com/some-listing-1234 would redirect to the equivalent page at http://duproprio.com/some-listing-1234 Even if my redirections are 301 permanent, there will be some juice lost for each redirected link that actually points to my old subdomains. This would also imply redirecting http://www.duproprio.com to http://duproprio.com, which is probably the part I'm most anxious about, since the incoming links are almost 50/50 between those two subdomains... Bringing everything back into a single subdomain is the thing to do in order to get all my SEO juice together; this part is obvious... But what can I do to make sure that I don't end up actually losing traffic instead of gaining authority? Can you help me get the confidence I need to make this "move" without risking losing tons of traffic? Thanks a big lot!
Intermediate & Advanced SEO | DuProprio.com
-
Duplicate blog content and NOINDEX
Suppose the "Home" page of your blog at www.example.com/domain/ displays your 10 most recent posts. Each post has its own permalink page (where you have comments/discussion, etc.). This obviously means that the last 10 posts show up as duplicates on your site. Is it good practice to use NOINDEX, FOLLOW on the blog root page (blog/) so that only one copy gets indexed? Thanks, Akira
Intermediate & Advanced SEO | ahirai
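The tag the asker is proposing would sit in the head of the blog's root listing page - a minimal sketch of that one technique, nothing site-specific:

```html
<!-- Keep the listing page out of the index while still letting
     crawlers follow its links to the individual permalink pages. -->
<meta name="robots" content="noindex, follow" />
```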