How should I manage duplicate content caused by a guided navigation for my e-commerce site?
-
I am working with a company that uses Endeca to power the guided navigation for our e-commerce site. I am concerned that the duplicate content generated by serving the same products under numerous refinement levels is damaging the site's ability to rank well, and I was hoping the Moz community could help me understand how much of an impact this type of duplicate content could be having. I would also love to know whether there are any best practices for managing this type of navigation. Should I nofollow all URLs that have more than one refinement applied to a category, or should I allow the search engines to go deeper than that to preserve the long tail? Any help would be appreciated. Thank you.
-
This was exactly what I was looking for. Thank you very much; you have really helped me out.
-
Hi there,
My former agency has a good post on pagination that you might find useful: http://www.ayima.com/seo-knowledge/conquering-pagination-guide.html
You definitely want to cut down on duplicate content as much as possible - let me know if that post does the trick for the ecommerce question!
Cheers
-
Hi David,
I would like to point you to an article:
Maybe you noticed it already? It's hard to give you a recommendation for the refinement levels... in general I would advise you to be very careful with that... to me, what you've done so far doesn't sound so bad...
-
You are absolutely right about nofollow overuse being a trust factor. I had not thought about that aspect of this issue, and thank you for bringing it up. Regarding canonical and rel=prev/next, I am not sure what an implementation of this would look like. I added rel=canonical pointing to the www version of the page URL without any unnecessary parameters, and I am familiar with the idea of having a "Show All" page so as to avoid pagination (we added our pagination parameters to Google Webmaster Tools instead). Would you recommend using canonical to roll up results pages to a category and parent refinement level, and if so, how many refinements would you recommend before drawing the line?
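To make that "roll up to a category and parent refinement level" idea concrete, here is a minimal sketch in Python (Endeca's actual URL scheme isn't shown in this thread, so the path layout and the one-refinement threshold below are hypothetical assumptions, not how Endeca necessarily structures its URLs):

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical scheme: refinements appear as path segments after the
# category, e.g. /bikes/mountain/red/29er. MAX_REFINEMENTS controls how
# many refinement levels a page may keep before being rolled up.
MAX_REFINEMENTS = 1

def canonical_for(url, category_depth=1, max_refinements=MAX_REFINEMENTS):
    """Return the canonical URL: keep the category plus at most
    `max_refinements` refinement segments, and drop the query string."""
    parts = urlsplit(url)
    segments = [s for s in parts.path.split("/") if s]
    keep = segments[: category_depth + max_refinements]
    path = "/" + "/".join(keep)
    # Rebuild without query/fragment so sort and tracking parameters
    # never split the canonical signal across URL variants.
    return urlunsplit((parts.scheme, parts.netloc, path, "", ""))

print(canonical_for("https://shop.example.com/bikes/mountain/red/29er?sort=price"))
# -> https://shop.example.com/bikes/mountain
```

The emitted URL would then go into the page's `<link rel="canonical" href="...">` tag, so every deep refinement variant consolidates onto its one-refinement parent.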
Thank you again,
David
-
The only distinction (if there is any) you can make when it comes to duplicate content (DC) is between partial and "normal" DC... keep in mind that any type (!!!) of DC won't do your site any good! Avoid DC whenever and wherever you can, under all circumstances... I do not know Endeca, but dealing with DC caused by a navigational structure is a serious problem, especially within a shop system.
There are different ways to fight DC or to confine it... the most common are rel=prev/next and rel=canonical... these are alternatives and never perfect solutions, but there are lots of scenarios where they are a big help.
I would be careful with follow and nofollow... if you let the robot follow everything, this might lead to lots of errors in the scenario you describe, but on the other hand, setting many URLs to nofollow can also harm your site because it's not a very trustworthy signal for Google.
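As an illustration of the rel=prev/next and robots options discussed above, here is a small sketch that builds the head tags a paginated refinement page might emit (Python string-building, since no templating language is given in the thread; the URLs, parameter names, and one-refinement threshold are hypothetical). Note it uses noindex,follow rather than nofollow, which avoids the trust-signal concern while still keeping thin pages out of the index:

```python
def head_tags(base_url, page, last_page, refinement_count, max_refinements=1):
    """Build the <head> tags for one page of a paginated facet listing.

    Pages deeper than `max_refinements` get noindex,follow instead of
    nofollow, so link equity still flows but the thin page stays
    out of the index.
    """
    tags = []
    if refinement_count > max_refinements:
        tags.append('<meta name="robots" content="noindex,follow">')
    # rel=prev/next tie the paginated series together for crawlers.
    if page > 1:
        tags.append(f'<link rel="prev" href="{base_url}?page={page - 1}">')
    if page < last_page:
        tags.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    # Self-referencing canonical keeps sort/tracking parameters from
    # splitting the page's signals across URL variants.
    tags.append(f'<link rel="canonical" href="{base_url}?page={page}">')
    return "\n".join(tags)

print(head_tags("https://shop.example.com/bikes/mountain", 2, 5, refinement_count=1))
```

Whichever combination you pick, keep it consistent: a page that is noindexed should not also be the canonical target of other pages.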