How to exclude URL filter searches in robots.txt
-
When I look through my Moz reports I can see they include 'pages' that shouldn't be there, i.e. URLs generated by filtering rules such as this one: http://www.mydomain.com/brands?color=364&manufacturer=505
How can I exclude all of these filters in the robots.txt? I think it'll be:
Disallow: /*?color=$
Is that the correct syntax with the $ sign in it? Thanks!
-
Unless you're specifically calling out Bing or Baidu in your robots.txt file, they should follow the same directives as Google, so testing with Google's robots.txt tester should suffice for all of them.
-
Yes, but what about Bing and the rest of the search engines?
-
Adrian,
I agree that there certainly is a right answer to the question as posted, since it asks specifically about one way to manage the issue: blocking the filters in the robots.txt file. What I was getting at is that this may or may not be the "best" way, and that I'd need to look at your site and your unique situation to figure out which solution best fits your needs.
It is very likely that with these parameters a robots.txt block is the best approach, assuming the parameters aren't added by default to category page or category pagination navigational links, as that would affect the bots' ability to crawl the site. Also, if people are linking to those URLs (highly unlikely, though), you may want to consider a robots meta noindex,follow tag instead so that PageRank can flow through to other pages.
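For reference, the noindex,follow alternative is just a meta tag in each filtered page's head; a minimal illustration (the placement and example URL are assumptions about your templates):

```html
<!-- Placed in the <head> of filtered URLs such as /brands?color=364:
     keeps the page out of the index while still letting bots follow
     its links, so link equity can flow to other pages. -->
<meta name="robots" content="noindex,follow">
```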
And I'm not entirely sure the code you provided above will work when the blocked parameter is the first one in the query string (e.g. domain.com/category/?color=red), given the wildcard sitting between the start of the path and the ?. I would advise testing this in Google Webmaster Tools first:
- On the Webmaster Tools Home page, click the site you want.
- Under Crawl, click Blocked URLs.
- If it's not already selected, click the Test robots.txt tab.
- Copy the content of your robots.txt file, and paste it into the first box.
- In the URLs box, list the site to test against.
- In the User-agents list, select the user-agents you want (e.g. Googlebot)
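If you want a quick offline check as well, be aware that Python's built-in urllib.robotparser implements only the original robots.txt spec: it treats each rule as a literal path prefix and does not understand Google's * wildcard, so it is not a reliable way to validate these patterns. A short sketch (the rule and URL are the ones from this thread):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /*?color=",
])

# The stdlib parser treats "/*?color=" as a literal prefix, so it does
# NOT report this wildcard URL as blocked -- unlike Google's matcher,
# which expands the * wildcard.
print(rp.can_fetch("*", "http://www.mydomain.com/brands?color=364"))  # True (i.e. not blocked)
```

This is why testing in Google's own tool matters: different consumers of robots.txt disagree on wildcard support.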
-
There certainly is a right answer to my question - I already posted it here earlier today:
Disallow: /*?color=
Disallow: /*?manufacturer=
Without the $ at the end, which would otherwise denote the end of the URL.
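To see concretely why the trailing $ matters, here is a minimal Python sketch of Google-style pattern matching (a simplified illustration with a hypothetical helper, not Google's actual implementation):

```python
import re

def robots_rule_matches(pattern: str, path: str) -> bool:
    """Match a URL path against a Google-style robots.txt rule:
    '*' matches any sequence of characters, and a trailing '$'
    anchors the pattern to the end of the URL.
    (Simplified sketch for illustration only.)"""
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    if anchored:
        regex += "$"
    return re.match(regex, path) is not None

url = "/brands?color=364&manufacturer=505"
print(robots_rule_matches("/*?color=", url))    # True:  blocked
print(robots_rule_matches("/*?color=$", url))   # False: '$' requires the URL to end right after "color="
print(robots_rule_matches("/*?color=", "/category/?color=red"))  # True: matches when the filter is the first parameter too
```

One subtlety worth noting: since ? is a literal character in robots.txt patterns, a rule like /*?manufacturer= only catches URLs where manufacturer is the first parameter; a broader /*manufacturer= would catch it in any position.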
-
Hello Adrian,
The Moz reports are meant to help you uncover issues like this. If you're seeing non-canonical URLs in the Moz report then there is a potential issue for Google, Bing and other search engines as well.
Google does respect wildcards (*) in the robots.txt file, though they can easily be done wrong. There is no right or wrong answer to the issue of using filters or faceted navigation, as each circumstance is going to be different. However, I hope some of these articles will help you identify the best approach for your needs:
(Note: faceted navigation is not exactly the same as category filters, but the issues and possible solutions are very similar.)
Building Faceted Navigation That Doesn't Suck
Faceted Navigation Whiteboard Friday
Duplicate Content: Block, Redirect or Canonical
Guide to eCommerce Facets, Filters and Categories
Rel Canonical How To and Why Not
Moz.com Guide to Duplicate Content
I don't know how your store handles these filters (e.g. does it add them automatically, or only when a user selects one?), so I can't give you the answer, but I promise that if you read the articles above you will have a very good understanding of all of the options and can choose what is best for you. That might end up being as simple as blocking the filters in your robots.txt file, or you may opt for rel canonical, a noindex meta tag, AJAX, Google parameter handling, etc.
Good luck!
-
It's not Google's index that I'm interested in in this case; it's the Moz reports. Moz was including over 10,000 'pages' because it was indexing these URLs. Now that I know how to edit the robots.txt file, Moz will be prevented from indexing them again (we only have around 2,000 real pages, not 10,000).
-
I sought out the answer from a developer and got the following reply, so posting here in case it helps someone else:
To exclude pages with color or manufacturer in them you can use:
Disallow: /*?color=
Disallow: /*?manufacturer=
The dollar sign ($) in your attempt should be omitted, as it denotes the end of the URL.
-
Hi
I would recommend excluding these in Google Webmaster Tools. Once logged in to your account under the "Crawl" menu you will find "URL Parameters". Find the relevant parameter in the list on this page and you can tell Google not to index these pages.
Hope this helps.
Steve