How should I manage duplicate content caused by a guided navigation for my e-commerce site?
-
I am working with a company that uses Endeca to power the guided navigation for our e-commerce site. I am concerned that the duplicate content generated by serving the same products under numerous refinement levels is damaging the site's ability to rank well, and I was hoping the Moz community could help me understand how much of an impact this type of duplicate content could be having. I would also love to know whether there are any best practices for managing this type of navigation. Should I nofollow all of the URLs that have more than one refinement applied to a category, or should I allow the search engines to go deeper than that to preserve the long tail? Any help would be appreciated. Thank you.
-
This was exactly what I was looking for. Thank you very much; you have really helped me out.
-
Hi there,
My former agency has a good post on pagination that you might find useful: http://www.ayima.com/seo-knowledge/conquering-pagination-guide.html
You definitely want to cut down on duplicate content as much as possible - let me know if that post does the trick for the e-commerce question!
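To illustrate the rel=prev/next approach that the guide covers, here is a rough sketch of how the link tags for a paginated series could be generated. The `?page=N` URL pattern is an assumption for illustration, not your site's actual scheme:

```python
def pagination_links(base_url: str, page: int, last_page: int) -> list[str]:
    """Build the <link rel="prev"/"next"> head tags for one page of a
    paginated series: no "prev" on the first page, no "next" on the last."""
    links = []
    if page > 1:
        links.append(f'<link rel="prev" href="{base_url}?page={page - 1}">')
    if page < last_page:
        links.append(f'<link rel="next" href="{base_url}?page={page + 1}">')
    return links
```

The point of the pattern is that the middle pages declare both neighbours, so the series is treated as one sequence rather than many near-duplicate pages.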
Cheers
-
Hi David,
I would like to point you to an article; maybe you have noticed it already? It's hard to give a recommendation for the refinement levels... in general I would advise you to be very careful with that. What you've done so far does not sound so bad to me.
-
You are absolutely right about nofollow overuse being a trust factor. I had not thought about that aspect of this issue; thank you for bringing it up. Regarding canonical and rel=prev/next, I am not sure what an implementation of this would look like. I added rel=canonical pointing to the www version of the page URL without any unnecessary parameters, and I am familiar with the idea of having a "Show All" page to avoid pagination (we added our pagination parameters to Google Webmaster Tools instead). Would you recommend using canonical to roll up results pages to a category and parent refinement level, and if so, how many refinements would you allow before drawing the line?
Thank you again,
David
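For what it's worth, the "roll up to a category plus one refinement" idea can be sketched in code. This is only an illustration under assumed conventions: the query parameters `color`, `brand`, etc. stand in for refinements, and a real Endeca install uses its own URL scheme, so the parameter names here are hypothetical.

```python
from urllib.parse import urlsplit, parse_qsl, urlencode, urlunsplit

# Hypothetical refinement parameters; an Endeca install would use its own scheme.
REFINEMENT_PARAMS = ["color", "brand", "size", "price"]
MAX_INDEXABLE_REFINEMENTS = 1  # roll anything deeper up to one refinement

def canonical_url(url: str) -> str:
    """Return the canonical URL for a faceted category page: keep the
    category path, drop tracking/session parameters, and keep at most
    MAX_INDEXABLE_REFINEMENTS refinement parameters in a fixed order,
    so parameter ordering never creates extra duplicate URLs."""
    scheme, netloc, path, query, _ = urlsplit(url)
    params = dict(parse_qsl(query))
    kept = []
    for name in REFINEMENT_PARAMS:  # fixed order normalises ?b=..&a=..
        if name in params and len(kept) < MAX_INDEXABLE_REFINEMENTS:
            kept.append((name, params[name]))
    return urlunsplit((scheme, netloc, path, urlencode(kept), ""))
```

The returned URL is what you would emit in the rel=canonical tag of every deeper refinement page, so all the permutations consolidate onto the one-refinement version.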
-
The only distinction (if there is any) you can make when it comes to duplicate content (DC) is between partial and "normal" DC... keep in mind that any type of DC won't do your site any good! Avoid DC whenever and wherever you can, under all circumstances. I do not know Endeca, but dealing with DC caused by a navigational structure is a serious problem, especially within a shop system.
There are different ways to fight DC or to confine it... the most common are rel=prev/next and rel=canonical... these are alternatives and never perfect solutions, but there are lots of scenarios where they are a big help.
I would be careful with follow and nofollow... if you let the robot follow everything, this might lead to lots of errors in the scenario you describe, but on the other hand, setting many URLs to nofollow can also harm your site, because it's not a very trustworthy signal for Google.
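One way to act on that caution is to put noindex,follow on deep refinement pages instead of nofollowing the links to them: crawlers can still reach the long tail, but only the shallow pages enter the index. A hedged sketch, where the one-refinement threshold is just an example value:

```python
def robots_meta(refinement_count: int, max_indexable: int = 1) -> str:
    """Choose a robots meta value for a faceted page: index pages with few
    refinements, and noindex (but still follow) deeper ones, so link equity
    keeps flowing without flooding the index with near-duplicates."""
    if refinement_count <= max_indexable:
        return "index,follow"
    return "noindex,follow"
```

The returned value would go into `<meta name="robots" content="...">` on each refinement page, leaving every internal link as a normal followed link.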