Best way to handle page filters and sorts
-
Hello Mozzers, I have a question about the best way to handle filters and sorts with Googlebot.
I have a page that returns a list of widgets. I have a "root" page about widgets, plus filter and sort functionality that shows basically the same content but adds parameters to the URL. For example, if you filter the page of 10 widgets by color, the page returns the 3 red widgets on top and the 7 non-red widgets on the bottom. If you sort by size, the page shows the same 10 widgets sorted by size. We use traditional PHP URL parameters to pass filters and sorts, so Google naturally treats each of these as a separate URL.
Right now we don't do anything special for Google, but I've noticed in the SERPs that if I search for "Widgets", my "Widgets" and "Widgets - Blue" pages sometimes rank close to each other, which tells me Google (rightly) sees these as pages about the same topic. Ideally, though, I'd want only my root "Widgets" page to rank.
What is the best way to structure this setup for Googlebot? I suspect it's one or more of the following, but I'd love any advice:
- Put a rel=canonical tag on all of the parameter pages, pointing to the "root"
- Use the Google URL parameter tool and have it not crawl any URLs with my parameters
- Put a meta robots noindex tag on the parameter pages
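For clarity, options 1 and 3 would look something like this in the head of a filtered URL (the domain and filenames here are just placeholders):

```html
<!-- Option 1: on a filtered URL like /widgets.php?color=red,
     point search engines at the root page -->
<link rel="canonical" href="https://www.example.com/widgets.php" />

<!-- Option 3: alternatively, tell engines not to index the
     filtered page at all, while still following its links -->
<meta name="robots" content="noindex, follow" />
```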
Thanks!
-
The only thing I might add is that, depending on the business, it might be worth building a "Red Widgets" category (as an example). However, you would treat this like a sub-category and write its own category description. You would give it its own rel=canonical tag, treating it as the root of the "Red Widgets" category.
Nine times out of ten it isn't necessary to give sorting and filtering options their own category page, though, and a rel=canonical tag pointing to the canonical version of the page is the second-best option. The best option would be to not change the URL at all: only re-order the items, hiding some and featuring others. Most eCommerce platforms don't have this functionality at present, however. Rel canonical was made to span the gap until they do.
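As a sketch, if you did promote "Red Widgets" to a real sub-category, the canonicals would look something like this (placeholder URLs, assuming a parameter-based structure like the one described):

```html
<!-- On the "Red Widgets" sub-category root: a self-referencing canonical -->
<link rel="canonical" href="https://www.example.com/widgets.php?color=red" />

<!-- On a sorted variation of it, e.g. widgets.php?color=red&sort=size,
     the canonical still points back at the sub-category root -->
<link rel="canonical" href="https://www.example.com/widgets.php?color=red" />
```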
-
I'd definitely go with option 1: canonicalise all the parameter variations to the root page. This is a textbook example of what the rel=canonical tag is designed for.
In addition, because you say that many of the variations are also ranking, the canonical will consolidate that ranking onto the root page instead of throwing it away, as would happen if you used GWT to ignore the parameters.
Lastly, the canonical will be understood by most engines and only needs implementing once. If you go the GWT route, you'll also have to configure it manually in Bing Webmaster Tools, and then you'll have to remember to update both each time new parameters are implemented. Even then, it still won't cover secondary search engines, assuming they have any importance to your site.
I always think of the Webmaster Tools solution as the method of last resort, for when some technical reason prevents me from implementing correct canonicalisation/redirects. Consistency and lack of manual intervention are paramount for me in these situations.
Hope that helps!
Paul
-
I'd go with the parameter option:
1. Go to Webmaster Tools > Crawl > URL Parameters > Configure URL Parameters and enter all of the sorting/filtering parameters there.
2A. If all of your items are on one page, you can set up a canonical URL for that page (which would ignore all sorting parameters).
2B. If your categories have multiple pages, be sure to use rel=next/prev for pagination.
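For 2B, the pagination tags would go in the head of each paginated page and look something like this (placeholder URLs; a first page would carry only rel=next, a last page only rel=prev):

```html
<!-- In the <head> of page 2 of a paginated category -->
<link rel="prev" href="https://www.example.com/widgets.php?page=1" />
<link rel="next" href="https://www.example.com/widgets.php?page=3" />
```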