Best way to handle page filters and sorts
-
Hello Mozzers, I have a question that has to do with the best way to handle filters and sorts with Googlebot.
I have a page that returns a list of widgets. I have a "root" page about widgets, plus filter and sort functionality that shows basically the same content but adds parameters to the URL. For example, if you filter the page of 10 widgets by color, the page returns the 3 red widgets on top and the 7 non-red widgets on the bottom. If you sort by size, the page shows the same 10 widgets sorted by size. We use traditional PHP URL parameters to pass filters and sorts, so obviously Google views each of these as a separate URL.
Right now we don't do anything special for Google, but I have noticed in the SERPs that if I search for "Widgets", my "Widgets" page and my "Widgets - Blue" page sometimes both rank close to each other, which tells me Google basically (rightly) thinks these are all just pages about widgets. Ideally, though, I'd want only my root "Widgets" page to rank.
What is the best way to structure this setup for Googlebot? I think it's one or more of the following, but I'd love any advice:
- put a rel=canonical tag on all of the pages with parameters, pointing to the "root" page
- use the Google URL parameter tool and tell it not to crawl any URLs with my parameters
- put a robots meta noindex tag on the parameter pages
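For reference, options 1 and 3 would look something like this in the `<head>` of a parameterized page (the domain and paths here are illustrative, not from the original question):

```html
<!-- Option 1: on /widgets?color=red, point search engines at the root page -->
<link rel="canonical" href="https://www.example.com/widgets" />

<!-- Option 3: alternatively, keep the parameterized page out of the index
     entirely while still letting crawlers follow its links -->
<meta name="robots" content="noindex, follow" />
```

Note that you would use one or the other on a given page, not both: canonical consolidates signals to the root URL, while noindex simply drops the page from the index.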
Thanks!
-
The only thing I might add is that, depending on the business, it might be worth building a "Red Widgets" category (as an example). However, you would treat this like a sub-category and write its own category description. You would give it its own rel canonical tag, treating it as the root of the "Red Widgets" category.
Nine times out of ten it isn't necessary to give sorting and filtering options their own category page, though, and a rel canonical tag pointing to the canonical version of that page is the second-best option. The best option would be not to change the URL at all: only re-order the items, hiding some and featuring others. Most eCommerce platforms don't have this functionality at present, however. Rel canonical was made to span the gap until they do.
-
I'd definitely go with option 1 - to canonicalise all the parameter variations to the root page. This is a textbook example of what the canonical meta-tag is designed for.
In addition, because you say that many of the variations are also ranking, canonicalisation will pass that ranking to the root page instead of throwing it away, as would happen if you used Google Webmaster Tools (GWT) to ignore the parameters.
Lastly, the canonical tag is understood by most engines and only needs implementing once. If you go the GWT route, you'll also have to configure it manually in Bing Webmaster Tools, and then you'll have to remember to update both each time new parameters are implemented. Even then it won't work for secondary search engines, assuming they have any importance to your site.
I always think of the Webmaster Tools solution as the method of last resort if for some technical reason I am unable to implement correct canonicalisation/redirects. Consistency and lack of manual intervention are paramount for me in these situations.
Hope that helps?
Paul
-
I'd go with the parameter option:
1. Go to Webmaster Tools > Crawl > URL Parameters > Configure URL Parameters and enter all of the sorting/filtering parameters there.
2a. If all of your items are on one page, you can set up a canonical URL for that page (which would ignore all sorting parameters).
2b. If your categories have multiple pages, be sure to use rel=next/prev for pagination.
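For 2b, the pagination markup on page 2 of a category would look something like the following in the `<head>` (the domain and parameter name are illustrative):

```html
<!-- On https://www.example.com/widgets?page=2 -->
<link rel="prev" href="https://www.example.com/widgets?page=1" />
<link rel="next" href="https://www.example.com/widgets?page=3" />
```

Page 1 would carry only the rel=next tag, and the last page only rel=prev.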
Related Questions
-
What's the best way to use redirects on a massive site consolidation?
We are migrating 13 websites into a single new domain, and certain pages will be terminated or moved to a new folder path, so we need custom 301 redirects built for these. However, we have a huge database of pages that will NOT be changing folder paths, and it's far too many to write custom 301s for. One idea was to use domain forwarding or a wildcard redirect so that all those pages would redirect to the same folder path on the new domain. The problem this creates is that we would then need the custom 301s for content moving to a new folder path, creating 2 redirects on those pages (one for the domain forwarding, then a second custom 301 pointing to the new folder). Any ideas on a better solution?
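One common way to avoid the double hop is to put the specific 301 rules before the catch-all, so the specific moves match first and everything else falls through to the wildcard. A rough .htaccess sketch for the old domain (paths and domain names are illustrative placeholders, not from the question):

```apache
RewriteEngine On

# Specific moves first: pages changing folder path get a single direct 301
RewriteRule ^old-folder/page\.html$ https://www.newdomain.example/new-folder/page.html [R=301,L]

# Catch-all last: everything else keeps its existing path on the new domain
RewriteRule ^(.*)$ https://www.newdomain.example/$1 [R=301,L]
```

Because mod_rewrite rules are evaluated in order and the `[L]` flag stops processing on a match, each URL is redirected exactly once.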
-
Best Practices for Title Tags for Product Listing Page
My industry is commercial real estate in New York City. Our site has 300 real estate listings. The format we have been using for title tags is below, which is probably disastrous from an SEO perspective. Using numbers is a total waste of space. A few questions:
-Should we set listings to noindex if they are not content rich?
-If we do choose to index them, should we avoid titles listing square footage and dollar amounts?
-Since local SEO is critical, should the titles always list New York, NY or Manhattan, NY?
-I have read that titles should contain some form of branding. But our company name is Metro Manhattan Office Space. That would take up way too much space. Even "Metro Manhattan" is long. Do we need to use the title tag for branding, or can we just focus on a brief description of page content incorporating one important phrase? Our site is: www.metro-manhattan.com
Example titles:
| Turnkey Flatiron Tech Space | 2,850 SF $10,687/month |
| Gallery, Office Rental | Midtown, W. 57 St | 4,441 SF $24,055/month |
| Open Plan Loft | Flatiron, Chelsea | 2,414 SF $12,874/month |
| Tribeca Corner Loft | Varick Street | 2,267 SF $11,712/month |
| 275 Madison, LAW, P7, 3,252 SF, $65 - Manhattan, New York |
-
How does Googlebot evaluate performance/page speed on Isomorphic/Single Page Applications?
I'm curious how Google evaluates pagespeed for SPAs. Initial payloads are inherently large (resulting in 5+ second load times), but subsequent requests are lightning fast, as these requests are handled by JS fetching data from the backend. Does Google evaluate pages on a URL-by-URL basis, looking at the initial payload (and "slow"-ish load time) for each? Or do they load the initial JS+HTML and then continue to crawl from there? Another way of putting it: is Googlebot essentially "refreshing" for each page and therefore associating each URL with a higher load time? Or will pages that are crawled after the initial payload benefit from the speedier load time? Any insight (or speculation) would be much appreciated.
-
Best way to show content from articles my client is published/featured in
Hi. I was wondering what the best way is to show my audience articles that my client is featured in. My client is a surgeon who has been referenced in many articles in his specific field of cosmetic surgery. One idea is to repost the entire article but reference back to the original. Is there an SEO-friendly way of doing this? I have seen this done before; for example, Search Engine Journal author Larry Kim might repost something he wrote or published on WordStream onto Search Engine Journal, but with a note that it was originally posted on WordStream. I know the standard thinking is to always write new and unique content, but there is already a good amount written about our client referencing his work; how can we use this to our advantage and give new or prospective patients information regarding his credibility? Our client really does not want us to write articles for him, and he does not have the time to write them either. Again, the question: how can we leverage articles and studies that have already been published online featuring our client and show them in full on our own website?
-
Putting "noindex" on a page that's in an iframe... what will that mean for the parent page?
If I've got a page that is being called in an iframe, on my homepage, and I don't want that called page to be indexed.... so I put a noindex tag on the called page (but not on the homepage) what might that mean for the homepage? Nothing? Will Google, Bing, Yahoo, or anyone else, potentially see that as a noindex tag on my homepage?
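For context, a robots meta tag applies only to the document that carries it, so the directive in question would sit on the framed page itself, not the homepage (the comment and setup here are illustrative):

```html
<!-- In the <head> of the page loaded inside the iframe only;
     the parent homepage keeps its own robots directives -->
<meta name="robots" content="noindex" />
```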
-
How to rank product pages?
Hi guys, please advise me on improving my product pages' rankings. We are doing well for head terms and categories, but not ranking for product pages. We have issues with product pages which I think are hard to tackle. For instance, we have duplicate products (different colors), duplicate content internally (colors) and from manufacturer websites. Product pages are linked from sub-categories, i.e. Home > Category > Sub-Category (20 per page), using pagination for the next 20 and so on. Product pages are also linked internally via widgets such as "similar products", "featured products", etc. Another issue with our product pages is that we use a third-party reviews platform, and whenever users add reviews this platform creates hyperlinks to different anchors which are not relevant to the product. Example - http://goo.gl/NUG652 Can somebody please give some advice on how to improve rankings for product pages? Writing unique content for thousands of pages is not possible. Even our competitors are not writing unique content.
-
Best practice?
Hi there, I have recently written an article which I have posted on an online newspaper website. I want to use this article and put it on my blog as well; the reason the article will be placed on my blog is to drive users from my email marketing activities. Would it simply be best practice to disallow Google from crawling this page? Or to put a rel canonical on the article placed on my blog, pointing to the article on the online newspaper website? Thanks for any suggestions
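The second option described is a cross-domain canonical: the blog copy declares the newspaper version as the original. In the `<head>` of the blog copy it would look something like this (the URL is an illustrative placeholder):

```html
<link rel="canonical" href="https://www.newspaper-example.com/original-article" />
```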
-
Best way to de-index content from Google and not Bing?
We have a large quantity of URLs that we would like to de-index from Google (we are affected by Panda), but not Bing. What is the best way to go about doing this?
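One common approach is a crawler-specific robots meta tag: Google honors directives addressed to "googlebot" by name, while Bing's crawler ignores them (a sketch; verify current behavior for your case):

```html
<!-- Googlebot drops this page from Google's index; Bingbot ignores the tag -->
<meta name="googlebot" content="noindex" />
```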