Duplicate Content for Multiple Instances of the Same Product?
-
Hi again! We're set to launch a new inventory-based site for a chain of car dealers with various locations across the Midwest. Here's our issue:
The different branches have overlap in the products that they sell, and each branch is adamant that their inventory comes up uniquely in site search.
We don't want the site to get penalized for duplicate content; however, we don't want to implement a link rel=canonical because each product should carry the same weight in search.
We've talked about having a basic URL for these product descriptions, and each instance of the inventory would be canonicalized to this main product, but it doesn't really make sense for the site structure to do this.
Do you have any tips on how to ensure that these products (same description, new product from manufacturer) won't be penalized as duplicate content?
-
Yeah, no argument there. I worry about it from an SEO standpoint, but sometimes there really isn't a lot you can do, from a business standpoint. I think it's occasionally worth a little fight, though - sometimes, when all the dealers want to have their cake and eat it, too, they all suffer (at least, post-Panda). Admittedly, that's a long, difficult argument, and you have to decide if it's worth the price.
-
okunen,
When you say "overlap in the products that they sell", do they have two identical franchises e.g. 2 Toyota stores that are on opposite sides of the same city, or are they wanting to share pre-owned inventory across multiple sites?
-
Dr. Pete, I think we are on the same page. The reason I say don't worry too much about duplicate content when it comes to dealer inventory is that, for the most part, it's out of your control in the automotive industry. Dealers want to have their inventory on as many sites as they can, so it becomes virtually impossible to control.
-
I have to disagree with Mike a bit - this is the kind of situation that can cause problems, and I think the duplication across the industry actually makes it even more likely. Yes, the big players can get away with it, and Google understands the dynamic to some degree, but if you have a new site or smaller brand, you could greatly weaken your ranking ability. You especially have to be careful out of the gate, IMO, when your authority is weak.
To be fair, I'm assuming you're a small to mid-sized player and not a major brand, so if that's an incorrect assumption, let me know.
There aren't many have-your-cake-and-eat-it-too approaches to duplicate content in 2013. If you use rel=canonical, NOINDEX, etc., then some version of the page won't be eligible for ranking. If you don't, then the pages could dilute each other or even harm the ranking of the overall site. Each product won't "carry the same weight in search" - if you don't pick, Google will, and your internal site architecture and inbound link structure are always going to weight some pages more highly than others. Personally, I think it's better to choose than have the choice made for you (which is usually what happens).
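If you do go the canonical route, the mechanics are simple: every duplicate branch copy of a product page carries a link tag pointing at the one version you want eligible to rank. A rough sketch (the function name and URLs are made up for illustration, not taken from any real site):

```python
def canonical_tags(duplicate_urls, preferred_url):
    """Map each duplicate product URL to the <link rel="canonical">
    tag its <head> should carry, all pointing at the preferred URL."""
    tag = f'<link rel="canonical" href="{preferred_url}" />'
    return {url: tag for url in duplicate_urls}

# Hypothetical example: two branch copies consolidated to one URL.
tags = canonical_tags(
    ["https://example.com/branch-a/2013-civic",
     "https://example.com/branch-b/2013-civic"],
    "https://example.com/inventory/2013-civic",
)
```

The point of the sketch is just that the choice of `preferred_url` is yours to make once, rather than something Google infers page by page.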
I'd also wonder if this structure is really that great for users - people don't want to happen across nine versions of the same page that differ only by branch. The branch is your construct, not theirs, and it's important to view this from the visitor's perspective.
Unfortunately, I don't understand the business/site well enough to give you a great alternative. Is there a way to create a unified product URL/page, but still give the branch credit when a visitor hits the product via their sub-site? For example, you could cookie the visitor and then show the branch's template (logo, info, etc.) at the top of the page, but still keep one default URL that Google would see. As long as new visitors to the site also see that default, it's not a problem.
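That cookie idea could be as simple as the sketch below (the cookie and branch names are hypothetical; this isn't a prescribed implementation). The key property is that anyone without the cookie, including Googlebot, sees the one default version at the single product URL:

```python
def resolve_branch(cookies, known_branches, default="main"):
    """Pick which branch template (logo, contact info, etc.) to render
    on the shared product page. Visitors who arrived via a branch
    sub-site carry a 'branch' cookie; everyone else, including
    crawlers (which send no such cookie), gets the default brand
    version of the page."""
    branch = cookies.get("branch")
    return branch if branch in known_branches else default
```

You'd set the cookie on the branch sub-site's landing pages, then call something like this when rendering the shared product page.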
-
Don't worry so much about the duplicate content. Those same cars are probably on 5 to 10 other sites anyway, e.g. cars.com, Autotrader, etc. The search engines understand the automotive industry dynamic pretty well.
Focus on location, content, and authority. By content, I mean getting them to add unique descriptions to each vehicle and, if possible, unique text on the inventory search pages. Then, of course, relevant blog posts.
Depending on how much control you have over the inventory SEO, you should be able to make the meta titles/descriptions unique between the different dealers, too.
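As a rough sketch of that last point (the field names are invented for illustration), the title/description templates just need to fold in dealer-specific details so no two dealers emit identical meta tags for the same car:

```python
def dealer_meta(vehicle, dealer):
    """Build a meta title and description that stay unique per dealer
    by folding in the dealer's name and city alongside vehicle data."""
    car = f"{vehicle['year']} {vehicle['make']} {vehicle['model']}"
    return {
        "title": f"{car} for Sale | {dealer['name']} - {dealer['city']}",
        "description": (
            f"Shop this {car} at {dealer['name']} in {dealer['city']}. "
            f"Call {dealer['phone']} to schedule a test drive."
        ),
    }
```

Two dealers listing the same Civic would then get distinct title tags automatically, with no hand-written copy per listing.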