How to exclude URL filter searches in robots.txt
-
When I look through my Moz reports I can see that they include 'pages' which shouldn't have been included, i.e. URLs generated by filtering rules, such as this one: http://www.mydomain.com/brands?color=364&manufacturer=505
How can I exclude all of these filters in the robots.txt? I think it'll be:
Disallow: /*?color=$
Is that the correct syntax with the $ sign in it? Thanks!
-
Unless you're specifically calling out Bing or Baidu in your robots.txt file, they should follow the same directives as Google, so testing with Google's robots.txt tester should suffice for all of them.
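As a rough sketch of what that means in practice (the rules below are hypothetical, just to illustrate the point): any crawler that isn't named in its own group falls back to the `User-agent: *` group, so one set of directives covers Google, Bing, Baidu and the rest unless you deliberately split them out.

```
# Googlebot, Bingbot, Baiduspider, etc. all fall back to this group...
User-agent: *
Disallow: /*?color=

# ...unless you give a crawler its own group, in which case it uses
# only that group and ignores the * group entirely
User-agent: Bingbot
Disallow: /*?color=
```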
-
Yes, but what about Bing and the rest of the search engines?
-
Adrian,
I agree that there certainly is a right answer to the question as posted, since it asks specifically about one way to manage the issue: blocking the filters in the robots.txt file. What I was getting at is that this may or may not be the "best" way, and that I'd need to look at your site and your unique situation to figure out which would be the best solution for your needs.
It is very likely that with these parameters a robots.txt block is the best approach, assuming the parameters aren't added by default into category page or category pagination navigational links, as that would affect the bot's ability to crawl the site. Also, if people are linking to those URLs (highly unlikely, though) you may consider a robots meta noindex,follow tag instead so that PageRank can flow to other pages.
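For reference, the meta tag mentioned there is a standard element placed in the page's head; a minimal sketch of what it could look like on a filtered URL (placement within your templates is up to you):

```
<head>
  <!-- Keep this filtered page out of the index, but still let
       crawlers follow its links so equity can flow onwards -->
  <meta name="robots" content="noindex, follow">
</head>
```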
And I'm not entirely sure the pattern you provided above will work when the blocked parameter is the first one in the string (e.g. domain.com/category/?color=red), as there is the additional wildcard between the ? and the parameter. I would advise testing this in Google Webmaster Tools first (for a quick offline check, see the sketch after these steps):
- On the Webmaster Tools Home page, click the site you want.
- Under Crawl, click Blocked URLs.
- If it's not already selected, click the Test robots.txt tab.
- Copy the content of your robots.txt file, and paste it into the first box.
- In the URLs box, list the URLs you want to test against that file.
- In the User-agents list, select the user-agents you want (e.g. Googlebot).
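If you'd like that offline sanity check before pasting rules into the tester, the sketch below approximates the wildcard behaviour discussed in this thread. It is a simplification (no Allow rules, no longest-match precedence, no URL decoding), so treat Google's own tester as the final word; the example URLs and patterns are taken from this thread.

```python
import re

def robots_pattern_matches(pattern: str, url: str) -> bool:
    """Very rough approximation of Google-style robots.txt matching:
    '*' matches any run of characters, a trailing '$' anchors the end
    of the URL, and patterns are compared from the start of the path."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"  # trailing $ = "end of URL"
    return re.match(regex, url) is not None

# Paths/query strings to test (taken from this thread).
urls = [
    "/brands?color=364&manufacturer=505",
    "/category/?color=red",
    "/brands",
]

for pattern in ["/*?color=$", "/*?color=", "/?*manufacturer="]:
    for url in urls:
        print(f"{pattern:<18} {url:<40} blocked={robots_pattern_matches(pattern, url)}")
```

Running it shows, for example, that the trailing $ keeps /*?color=$ from matching /brands?color=364&manufacturer=505, which is why the $ needs to be dropped.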
-
There certainly is a right answer to my question - I already posted it here earlier today:
Disallow: /*?color=
Disallow: /?*manufacturer=
Without the $ at the end, which would otherwise denote the end of the URL.
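For context, a minimal sketch of how those two lines might sit in a complete robots.txt (the User-agent line and comment are illustrative additions, not part of the answer above):

```
# Applies to every crawler that has no more specific group of its own
User-agent: *
Disallow: /*?color=
Disallow: /?*manufacturer=
```

Moz's crawler (rogerbot) can also be given its own User-agent group if you only want the rules to affect the Moz reports rather than every search engine.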
-
Hello Adrian,
The Moz reports are meant to help you uncover issues like this. If you're seeing non-canonical URLs in the Moz report then there is a potential issue for Google, Bing and other search engines as well.
Google does respect wildcards (*) in the robots.txt file, though they can easily be used incorrectly. There is no right or wrong answer to the question of using filters or faceted navigation, as each circumstance is different. However, I hope some of these articles will help you identify the best approach for your needs:
(Note: faceted navigation is not exactly the same as category filters, but the issues and possible solutions are very similar.)
- Building Faceted Navigation That Doesn't Suck
- Faceted Navigation Whiteboard Friday
- Duplicate Content: Block, Redirect or Canonical
- Guide to eCommerce Facets, Filters and Categories
- Rel Canonical How To and Why Not
- Moz.com Guide to Duplicate Content
I don't know how your store handles these filters (e.g. does it add them automatically, or only when a user selects one?), so I can't give you the answer, but I promise that if you read the articles above you will have a very good understanding of all of the options and can choose which is best for you. That might end up being as simple as blocking the filters in your robots.txt file, or you may opt for rel canonical, a noindex meta tag, AJAX, Google parameter handling, etc.
Good luck!
-
It's not Google's index that I'm interested in in this case; it's the Moz reports. Moz was reporting over 10,000 'pages' because it was crawling these filter URLs. Now that I know how to edit the robots.txt, Moz will be prevented from picking them up again (we only have around 2,000 real pages, not 10,000).
-
I sought out the answer from a developer and got the following reply, so posting here in case it helps someone else:
To exclude pages with color or manufacturer in them you can use:
Disallow: /*?color=
Disallow: /?*manufacturer=
The $ sign in your attempt should be omitted, as it denotes the end of the URL.
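To make the matching concrete, here are the same two rules annotated against the example URL from the original question (the comments are mine; standard robots.txt matching compares patterns from the start of the path, so it's worth confirming both in Google's tester):

```
User-agent: *
# Matches /brands?color=364&manufacturer=505: the * spans "brands",
# then the literal "?color=" follows
Disallow: /*?color=
# Only matches URLs whose path is just "/" followed by a query string,
# e.g. /?manufacturer=505 - a pattern like /*manufacturer= would be
# needed to also catch /brands?color=364&manufacturer=505
Disallow: /?*manufacturer=
```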
-
Hi
I would recommend excluding these in Google Webmaster Tools. Once logged in to your account, under the "Crawl" menu you will find "URL Parameters". Find the relevant parameter in the list on that page and you can tell Google not to index these pages.
Hope this helps.
Steve