Techniques to fix eCommerce faceted navigation
-
Hi everyone,
I've read a lot about different techniques to fix duplicate content problems caused by eCommerce faceted navigation (e.g. redundant URL combinations of colors, sizes, etc.). From what I've seen, suggested methods include using AJAX or JavaScript to make the links functional for users only and to prevent bots from crawling through them.
I was wondering if this technique would work instead?
If we detect that the user is a robot, instead of displaying a link, we simply display its anchor text.
So what would be, for a human:

COLOR
<li><a href="red">red</a></li>
<li><a href="blue">blue</a></li>

would be, for a robot:

COLOR
<li>red</li>
<li>blue</li>

Any reason I shouldn't do this?
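For what it's worth, the check being proposed would look roughly like this server-side. This is a minimal, hypothetical sketch: the user-agent list and function names are illustrative only (real bot detection would need reverse-DNS verification and more), and posting it is not an endorsement of the approach.

```python
import re

# Hypothetical list of crawler user-agent substrings; real bot
# detection would need to be far more robust than this.
BOT_PATTERNS = re.compile(r"googlebot|bingbot|slurp|duckduckbot", re.IGNORECASE)

def is_bot(user_agent):
    """Very rough user-agent sniff; illustrative only."""
    return bool(BOT_PATTERNS.search(user_agent or ""))

def render_facet_option(name, url, user_agent):
    """Render a facet value as a link for humans, plain text for bots."""
    if is_bot(user_agent):
        return "<li>{}</li>".format(name)
    return '<li><a href="{}">{}</a></li>'.format(url, name)
```

Called with a normal browser user agent it returns the linked `<li>`; called with a Googlebot user agent it returns the bare text version.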
Thanks!
*** edit
Another reason to fix this is crawl budget since robots can waste their time going through every possible combination of facet. This is also something I'm looking to fix.
-
I share Alan's hesitation - it could look like cloaking, especially if a bot is making the call. If the pages aren't indexed yet, you could just "nofollow" the links - it sends the same signal transparently.
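For comparison, the transparent version of the same markup keeps the link visible to everyone and simply flags it:

```html
<li><a href="red" rel="nofollow">red</a></li>
<li><a href="blue" rel="nofollow">blue</a></li>
```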
Home Depot is probably pulling it off with the AJAX/JS implementation, which is a bit harder for Google to parse. They also have massive authority and a huge link profile, so they can usually squeak the small stuff by. You might not be so lucky. In general, it's best to stick to standard practices and not get too tricky.
-
I've been browsing sites to see what the big players are doing.
Homedepot.com seems to be doing exactly this: if you go to a category page and click a facet to narrow the results, the page is refreshed via AJAX.
If you load the same page with a Googlebot user agent, even with JavaScript enabled, clicking the checkbox does nothing!
Is this cloaking? Why is it legit?
-
But is it really cloaking? We wouldn't be showing different content. Just disabling links. This article describes a technique that's more akin to cloaking and justifies it because of "intent": http://www.seomoz.org/ugc/dealing-with-faceted-navigation-a-case-study.
The problem with canonical tags is that robots will still waste crawl budget going through all the combinations of facets we have. We have hundreds of categories of complex products, each with 10+ facets and 10+ options per facet...
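If crawl budget is the real concern (separate from the duplicate-content question), one common option is to block the facet parameters in robots.txt. A hypothetical sketch, assuming the facets appear as query parameters named color and size, which may not match your URL scheme:

```
User-agent: *
Disallow: /*?*color=
Disallow: /*?*size=
```

Google supports the `*` wildcard in robots.txt rules, so patterns like these stop crawling of faceted URLs without hiding the links from anyone.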
-
That would be cloaking; best not to do that.
A canonical tag would be best - that's what they're for.
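In markup, that looks like this in the <head> of each faceted URL, pointing at the unfiltered category page (the URL here is a placeholder):

```html
<link rel="canonical" href="https://www.example.com/shirts" />
```

So /shirts?color=red, /shirts?color=blue, etc. would all carry the same canonical, consolidating signals on the main category page.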