How to exclude URL filter searches in robots.txt
-
When I look through my Moz reports I can see they include 'pages' that shouldn't be there, i.e. URLs with filtering parameters such as this one: http://www.mydomain.com/brands?color=364&manufacturer=505
How can I exclude all of these filters in the robots.txt? I think it'll be:
Disallow: /*?color=$
Is that the correct syntax with the $ sign in it? Thanks!
-
Unless you're specifically calling out Bing or Baidu in your robots.txt file, they should follow the same directives as Google, so testing with Google's robots.txt tester should suffice for all of them.
-
Yes, but what about Bing and the rest of the search engines?
-
Adrian,
I agree that there certainly is a right answer to the question as posted, since it asks specifically about one way to manage the issue: blocking the filters in the robots.txt file. What I was getting at is that this may or may not be the "best" way, and that I'd need to look at your site and your unique situation to figure out which solution would best fit your needs.
It is very likely that with these parameters a robots.txt block is the best approach, assuming the parameters aren't added by default into category page or category pagination navigational links, as that would affect the bot's ability to crawl the site. Also, if people are linking to those URLs (highly unlikely, though), you may want to consider a robots meta noindex,follow tag instead, so the PageRank can flow to other pages.
And I'm not entirely sure the code you provided above will work when the blocked parameter is the first one in the query string (e.g. domain.com/category/?color=red), as there is an additional wildcard between the ? and the parameter. I would advise testing this in Google Webmaster Tools first.
- On the Webmaster Tools Home page, click the site you want.
- Under Crawl, click Blocked URLs.
- If it's not already selected, click the Test robots.txt tab.
- Copy the content of your robots.txt file, and paste it into the first box.
- In the URLs box, list the site to test against.
- In the User-agents list, select the user-agents you want (e.g. Googlebot).
-
There certainly is a right answer to my question - I already posted it here earlier today:
Disallow: /*?color=
Disallow: /*?manufacturer=
Without the $ at the end, which would otherwise denote the end of the URL.
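For anyone who wants to sanity-check this behaviour, here's a minimal Python sketch of Google-style robots.txt pattern matching. This is my own approximation of the wildcard rules (* matches any character sequence, a trailing $ anchors the match to the end of the URL), not Google's actual matcher, so treat it as an illustration only:

```python
import re

def robots_match(pattern: str, path: str) -> bool:
    """Return True if a URL path matches a Google-style robots.txt
    Disallow pattern, where '*' is a wildcard and a trailing '$'
    anchors the match to the end of the URL."""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    # Escape regex metacharacters, then restore '*' as a wildcard
    regex = "^" + re.escape(pattern).replace(r"\*", ".*")
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

# The filtered URL from the question:
url = "/brands?color=364&manufacturer=505"

# With the trailing $, the pattern only matches a URL that *ends*
# right after "color=", so the real filtered URL is NOT blocked:
print(robots_match("/*?color=$", url))   # False

# Without the $, everything after "color=" is still covered:
print(robots_match("/*?color=", url))    # True
```

This shows why dropping the $ matters: with it, `Disallow: /*?color=$` would only block a URL that literally ends with `color=`, which your filter URLs never do.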
-
Hello Adrian,
The Moz reports are meant to help you uncover issues like this. If you're seeing non-canonical URLs in the Moz report then there is a potential issue for Google, Bing and other search engines as well.
Google does respect wildcards (*) in the robots.txt file, though it can easily be done wrong. There is no right or wrong answer to the issue of using filters or faceted navigation, as each circumstance is going to be different. However, I hope some of these articles will help you identify the best approach for your needs:
(Note: faceted navigation is not exactly the same as category filters, but the issues and possible solutions are very similar.)
Building Faceted Navigation That Doesn't Suck
Faceted Navigation Whiteboard Friday
Duplicate Content: Block, Redirect or Canonical
Guide to eCommerce Facets, Filters and Categories
Rel Canonical How To and Why Not
Moz.com Guide to Duplicate Content
I don't know how your store handles these filters (e.g. does it add them automatically, or only when a user selects a filter?), so I can't give you the answer, but I promise that if you read the articles above you will have a very good understanding of all the options and can choose which is best for you. That might end up being as simple as blocking the filters in your robots.txt file, or you may opt for rel canonical, a noindex meta tag, AJAX, Google parameter handling, etc.
Good luck!
-
It's not Google's index that I'm interested in in this case; it's the Moz reports. Moz was including over 10,000 'pages' because it was indexing these URLs. Now that I know how to edit the robots.txt, Moz will be prevented from indexing them again (we only have around 2,000 real pages, not 10,000).
-
I sought out the answer from a developer and got the following reply, so posting here in case it helps someone else:
To exclude pages with color or manufacturer in them, you can use:
Disallow: /*?color=
Disallow: /*?manufacturer=
The dollar sign in your attempt should be omitted, as it denotes the end of the URL.
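Put together in context, the relevant section of the robots.txt file might look like this (the User-agent line is my assumption; adjust it if you only want to target specific crawlers):

```text
User-agent: *
Disallow: /*?color=
Disallow: /*?manufacturer=
```

Without the trailing $, each rule blocks every URL whose path contains the parameter anywhere after a question mark, regardless of what follows it.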
-
Hi
I would recommend excluding these in Google Webmaster Tools. Once logged in to your account, under the "Crawl" menu you will find "URL Parameters". Find the relevant parameter in the list on that page and you can tell Google not to index these pages.
Hope this helps.
Steve