How to exclude URL filter searches in robots.txt
-
When I look through my Moz reports I can see they include 'pages' that shouldn't be there, i.e. filtered URLs such as this one: http://www.mydomain.com/brands?color=364&manufacturer=505
How can I exclude all of these filters in the robots.txt? I think it'll be:
Disallow: /*?color=$
Is that the correct syntax with the $ sign in it? Thanks!
-
Unless you're specifically calling out Bing or Baidu in your robots.txt file, they should follow the same directives as Google, so testing with Google's robots.txt tester should suffice for all of them.
-
Yes, but what about Bing and the rest of the search engines?
-
Adrian,
I agree that there certainly is a right answer to the question posted, as the question asks specifically about one way to manage the issue, namely blocking filters in the robots.txt file. What I was getting at is that this may or may not be the "best" way, and that I'd need to look at your site and your unique situation to figure out which would be the best solution for your needs.
It is very likely that with these parameters a robots.txt file block is the best approach, assuming the parameters aren't added by default into category page or category pagination page navigational links, as then it would affect the bot's ability to crawl the site. Also, if people are linking to those URLs (highly unlikely though) you may consider a robots meta noindex,follow tag instead so the pagerank could flow to other pages.
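For reference, the noindex,follow alternative mentioned above is a tag placed in the `<head>` of each filtered page. A minimal example (the comment describes the commonly documented behavior):

```html
<!-- On each filtered URL (e.g. /brands?color=364): asks crawlers not to
     index the page, but still to follow its links so link equity can flow. -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt block, this requires the page to be crawled for the directive to be seen, which is why it suits the "people are linking to those URLs" scenario.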
And I'm not entirely sure the code you provided above will work if the blocked parameter is the first one in the string (e.g. domain.com/category/?color=red), since there is an additional wildcard between the ? and the parameter. I would advise testing this in Google Webmaster Tools first.
- On the Webmaster Tools Home page, click the site you want.
- Under Crawl, click Blocked URLs.
- If it's not already selected, click the Test robots.txt tab.
- Copy the content of your robots.txt file, and paste it into the first box.
- In the URLs box, list the site to test against.
- In the User-agents list, select the user-agents you want (e.g. Googlebot)
-
There certainly is a right answer to my question - I already posted it here earlier today:
Disallow: /*?color=
Disallow: /?*manufacturer=
Without the $ at the end, which would otherwise denote the end of the URL.
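Pieced together, a minimal robots.txt applying these rules site-wide for all crawlers might look like the sketch below (both directives are shown here in the `/*?param=` form used for the color rule; adjust to match how the parameters actually appear in your URLs):

```
# Block crawling of any URL containing a color or manufacturer
# filter parameter, for all user agents.
User-agent: *
Disallow: /*?color=
Disallow: /*?manufacturer=
```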
-
Hello Adrian,
The Moz reports are meant to help you uncover issues like this. If you're seeing non-canonical URLs in the Moz report then there is a potential issue for Google, Bing and other search engines as well.
Google does respect wildcards (*) in the robots.txt file, though it can easily be done wrong. There is no right or wrong answer to the issue of using filters or faceted navigation, as each circumstance is going to be different. However, I hope some of these articles will help you identify the best approach for your needs:
(Note: Faceted navigation is not exactly the same as category filters, but the issues and possible solutions are very similar.)
Building Faceted Navigation That Doesn't Suck
Faceted Navigation Whiteboard Friday
Duplicate Content: Block, Redirect or Canonical
Guide to eCommerce Facets, Filters and Categories
Rel Canonical How To and Why Not
Moz.com Guide to Duplicate Content
I don't know how your store handles these (e.g. does it add the filter automatically, or only when a user selects a filter?), so I can't give you the answer, but I promise that if you read the articles above you will have a very good understanding of all the options and can choose which is best for you. That might end up being as simple as blocking the filters in your robots.txt file, or you may opt for rel canonical, a noindex meta tag, AJAX, Google parameter handling, etc.
Good luck!
-
It's not Google's index that I'm interested in in this case, it's the Moz reports. Moz was including over 10,000 'pages' because it was crawling these filtered URLs. Now that I know how to edit the robots.txt, Moz will be prevented from including them again (we only have around 2,000 real pages, not 10,000).
-
I sought out the answer from a developer and got the following reply, so posting here in case it helps someone else:
To exclude pages with color or manufacture in them you can use
Disallow: /*?color=
Disallow: /?*manufacturer=
The $ sign in your original attempt should be omitted, as it denotes the end of the URL.
-
Hi
I would recommend excluding these in Google Webmaster Tools. Once logged in to your account under the "Crawl" menu you will find "URL Parameters". Find the relevant parameter in the list on this page and you can tell Google not to index these pages.
Hope this helps.
Steve