How to exclude URL filter searches in robots.txt
-
When I look through my Moz reports I can see they include 'pages' that shouldn't have been included, i.e. URLs generated by filtering rules such as this one: http://www.mydomain.com/brands?color=364&manufacturer=505
How can I exclude all of these filters in the robots.txt? I think it'll be:
Disallow: /*?color=$
Is that the correct syntax with the $ sign in it? Thanks!
-
Unless you're specifically calling out Bing or Baidu in your robots.txt file, they should follow the same directives as Google, so testing with Google's robots.txt tester should suffice for all of them.
-
Yes, but what about Bing and the rest of the search engines?
-
Adrian,
I agree that there certainly is a right answer to the question posted, as the question asks specifically about one way to manage the issue: blocking the filters in the robots.txt file. What I was getting at is that this may or may not be the "best" way, and that I'd need to look at your site and your unique situation to figure out which solution would best fit your needs.
It is very likely that with these parameters a robots.txt block is the best approach, assuming the parameters aren't added by default to category page or category pagination navigational links, as then it would affect the bot's ability to crawl the site. Also, if people are linking to those URLs (highly unlikely, though) you may consider a robots meta noindex,follow tag instead, so that PageRank can flow to other pages.
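For reference, the noindex,follow alternative mentioned above is a tag placed in the `<head>` of each filtered page; a minimal sketch would look something like this:

```html
<!-- Placed in the <head> of each filtered URL, e.g. /brands?color=364 -->
<!-- noindex keeps the page out of the index; follow lets link equity flow on -->
<meta name="robots" content="noindex,follow">
```

One caveat: for a crawler to see this tag, the URL must not also be blocked in robots.txt, since a blocked page is never fetched.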
And I'm not entirely sure the code you provided above will work if the blocked parameter is the first one in the string (e.g. domain.com/category/?color=red) as there is the additional wildcard between the ? and the parameter. I would advise testing this in Google Webmaster Tools first.
- On the Webmaster Tools Home page, click the site you want.
- Under Crawl, click Blocked URLs.
- If it's not already selected, click the Test robots.txt tab.
- Copy the content of your robots.txt file, and paste it into the first box.
- In the URLs box, list the URLs you want to test against.
- In the User-agents list, select the user-agents you want (e.g. Googlebot)
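Since the Webmaster Tools tester is interactive, it can also help to sanity-check patterns offline. Here is a minimal sketch (my own approximation, not Google's actual matcher; `robots_pattern_to_regex` and `is_blocked` are hypothetical helpers) of how Google-style wildcard matching treats `*` and `$`:

```python
import re

def robots_pattern_to_regex(pattern):
    """Translate a Google-style robots.txt path pattern into a regex.

    '*' matches any sequence of characters; a trailing '$' anchors
    the match to the end of the URL. Everything else is literal.
    """
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    body = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.compile("^" + body + ("$" if anchored else ""))

def is_blocked(disallow_pattern, url_path):
    """Return True if url_path would match the given Disallow pattern."""
    return robots_pattern_to_regex(disallow_pattern).match(url_path) is not None

# The pattern from the question, with and without the trailing '$':
print(is_blocked("/*?color=", "/brands?color=364&manufacturer=505"))   # True
print(is_blocked("/*?color=$", "/brands?color=364&manufacturer=505"))  # False: '$' demands the URL end right after 'color='
# The case raised above: the parameter appearing right after a slash
print(is_blocked("/*?color=", "/category/?color=red"))                 # True
```

This illustrates why dropping the trailing `$` matters: with it, the rule only matches URLs that end exactly at `color=`, which real filter URLs never do.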
-
There certainly is a right answer to my question - I already posted it here earlier today:
Disallow: /*?color=
Disallow: /*?manufacturer=
Without the $ at the end, which would otherwise denote the end of the URL.
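Putting the two rules together, the finished robots.txt block would look something like this (the `User-agent: *` line is an assumption, applying the rules to all crawlers):

```
User-agent: *
Disallow: /*?color=
Disallow: /*?manufacturer=
```

One caveat worth testing: `/*?manufacturer=` only matches URLs where `manufacturer` follows a `?` directly; if it can also appear after an `&` (as in the example URL in the question), a broader pattern such as `Disallow: /*manufacturer=` may be needed.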
-
Hello Adrian,
The Moz reports are meant to help you uncover issues like this. If you're seeing non-canonical URLs in the Moz report then there is a potential issue for Google, Bing and other search engines as well.
Google does respect wildcards (*) in the robots.txt file, though they can easily be done wrong. There is no right or wrong answer to the issue of using filters or faceted navigation, as each circumstance is going to be different. However, I hope some of these articles will help you identify the best approach for your needs:
(Note: faceted navigation is not exactly the same as category filters, but the issues and possible solutions are very similar.)
- Building Faceted Navigation That Doesn't Suck
- Faceted Navigation Whiteboard Friday
- Duplicate Content: Block, Redirect or Canonical
- Guide to eCommerce Facets, Filters and Categories
- Rel Canonical How To and Why Not
- Moz.com Guide to Duplicate Content
I don't know how your store handles these (e.g. does it add the filter automatically, or only when a user selects a filter?), so I can't give you the answer, but I promise that if you read the articles above you will have a very good understanding of all of the options so you can choose which is best for you. That might end up being as simple as blocking the filters in your robots.txt file, or you may opt for rel canonical, a noindex meta tag, AJAX, Google parameter handling, etc.
Good luck!
-
It's not Google's index that I'm interested in in this case; it's the Moz reports. Moz was including over 10,000 'pages' because it was indexing these URLs. Now that I know how to edit the robots.txt file, Moz will be prevented from indexing them again (we only have around 2,000 real pages, not 10,000).
-
I sought out the answer from a developer and got the following reply, so I'm posting it here in case it helps someone else:
To exclude pages with color or manufacturer in them you can use:
Disallow: /*?color=
Disallow: /*?manufacturer=
The dollar sign in your attempt should be omitted, as it denotes the end of the URL.
-
Hi
I would recommend excluding these in Google Webmaster Tools. Once logged in to your account, under the "Crawl" menu you will find "URL Parameters". Find the relevant parameter in the list on this page and you can tell Google how to handle these pages during crawling.
Hope this helps.
Steve