Avoiding duplicate content with national e-commerce products and localized vendors
-
Hello 'mozzers!
For our example purposes, let's say we have a national cog reseller, www.cogexample.com, focusing on B2C cog sales. The website's SEO efforts revolve around keywords with high search volumes -- no long tail keywords here!
CogExample.com sells over 35,000 different varieties of cogs online, broken into search-engine-friendly categories and using both HTML and meta pagination techniques to ensure adequate deep-linking and indexing of their individual product pages.
With their recent fiscal success, CogExample.com has signed 2,500 retailers across the United States to re-sell their cogs.
CogExample.com's primary objective is B2C online sales for their highly sought search terms, e.g. "green cogs". However, CogExample.com also wants their retailers to show up for local/geo search, e.g. "seattle green cogs".
The geo/location-based retailers' web content will be delivered from the same database as the primary online store, and is thus very likely to cause duplicate content issues.
Questions
1. If the canonical tag is used to point the geo-based product page to the primary online product page, the geo-based page will likely be placed in the supplemental index. Is this correct?
2. Given the massive product database (35,000 products) and retailer count (2,500), it is not feasible to rewrite 87,500,000 pages of content to satisfy unique-content needs. Is there any way to prevent a duplicate content penalty?
3. Google product feeds will be used to localize content and feed Google's product search. Is this "enough" to garner sizable amounts of traffic and/or retain SERP ranks?
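For reference, question 1 describes a setup like the following, where a retailer-specific geo page points back to the primary product page (the URLs here are hypothetical, for illustration only):

```html
<!-- On the geo-based retailer page, e.g. a hypothetical /seattle/green-cogs -->
<link rel="canonical" href="http://www.cogexample.com/green-cogs" />
```

With this in place, Google consolidates ranking signals onto the canonical URL, so the geo page is unlikely to rank on its own for local queries, which is exactly the trade-off the question raises.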
-
If this solution works you have a lot of potential customers.
Here is a thought... Since Google demotes sites that have lots of duplicate content, if you use <meta name="robots" content="noindex, follow" /> on all of the duplicate pages, then the pages that remain in the index will have a better chance of ranking.
There are also other ways to keep them out of the index.
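For completeness, one of those other ways is disallowing the duplicate section in robots.txt. A minimal sketch, assuming the duplicate geo pages live under a hypothetical /retailers/ path:

```text
# robots.txt: block crawling of the duplicate geo section (hypothetical path)
User-agent: *
Disallow: /retailers/
```

Note that robots.txt prevents crawling rather than indexing, and already-discovered URLs can still appear in the index; for pages that should stay crawlable so link equity flows, the meta robots noindex approach is usually the safer choice.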
-
I was asked the same question recently by a person selling sunglasses: thousands of them, all with straight copy-and-paste manufacturer descriptions.
The only solution I can come up with that is anywhere near feasible is to create unique landing pages for the main keywords: in your case, a landing page for green cogs; in his case, a landing page for prescription sunglasses. You can hopefully develop a few pages to get ranked and turn those rankings into customers.
With that strategy I also recommended adding a review section, which would hopefully turn those customers into reviewers, thus generating unique content on each product page.
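If reviews do come in, marking them up can help search engines recognize the new, unique content. A minimal sketch using schema.org microdata (the reviewer name and text are placeholders):

```html
<div itemscope itemtype="https://schema.org/Review">
  <span itemprop="author">Jane D.</span> rated it
  <span itemprop="reviewRating" itemscope itemtype="https://schema.org/Rating">
    <span itemprop="ratingValue">4</span>/<span itemprop="bestRating">5</span>
  </span>
  <p itemprop="reviewBody">These green cogs fit my project perfectly.</p>
</div>
```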
That is my best solution to date...
-
2. Given the massive product database (35,000 products) and retailer count (2,500), it is not feasible to rewrite 87,500,000 pages of content to satisfy unique-content needs. Is there any way to prevent a duplicate content penalty?
Everybody everywhere is asking this question: "I have twenty-five websites that sell the same product, and I use the same product descriptions, photos, captions, title tags, etc. on every one of them. Is there any way to fool Google into believing that these are unique?"
Your problem is 100 times larger.
Looking at the history... Google has been killing "instant storefront" websites for the past several years.
If you figure out a way to do this you will be able to make a lot more money selling the solution than you are going to make from your 35,000 products.
I think that is the money making opportunity.