Moz Q&A is closed.
After more than 13 years, and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
How to exclude URL filter searches in robots.txt
-
When I look through my Moz reports I can see it has included 'pages' which it shouldn't have, i.e. URLs with filtering parameters such as this one: http://www.mydomain.com/brands?color=364&manufacturer=505
How can I exclude all of these filters in the robots.txt? I think it'll be:
Disallow: /*?color=$
Is that the correct syntax with the $ sign in it? Thanks!
-
Unless you're specifically calling out Bing or Baidu in your robots.txt file, they should follow the same directives as Google, so testing with Google's robots.txt tester should suffice for all of them.
-
Yes, but what about Bing and the rest of the search engines?
-
Adrian,
I agree that there certainly is a right answer to the question posted, as it asks specifically about one way to manage the issue: blocking the filters in the robots.txt file. What I was getting at is that this may or may not be the "best" way, and that I'd need to look at your site and your unique situation to figure out which solution would best fit your needs.
It is very likely that with these parameters a robots.txt block is the best approach, assuming the parameters aren't added by default to category page or pagination navigational links, as that would affect a bot's ability to crawl the site. Also, if people are linking to those URLs (highly unlikely, though), you may consider a robots meta noindex,follow tag instead, so that PageRank can flow to other pages.
And I'm not entirely sure the code you provided above will work if the blocked parameter is the first one in the string (e.g. domain.com/category/?color=red), as there is an additional wildcard between the ? and the parameter. I would advise testing this in Google Webmaster Tools first:
- On the Webmaster Tools Home page, click the site you want.
- Under Crawl, click Blocked URLs.
- If it's not already selected, click the Test robots.txt tab.
- Copy the content of your robots.txt file, and paste it into the first box.
- In the URLs box, list the URLs you want to test against.
- In the User-agents list, select the user-agents you want (e.g. Googlebot).
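As a worked example of the steps above, here is a minimal robots.txt you could paste into the tester's first box. The parameter names come from the question; note that `/*` placed directly before a parameter name matches it whether it appears first or later in the query string. This is an illustrative sketch, so verify it against your own URLs in the tester:

```text
User-agent: *
Disallow: /*color=
Disallow: /*manufacturer=
```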
-
There certainly is a right answer to my question - I already posted it here earlier today:
Disallow: /*?color=
Disallow: /?*manufacturer=
Without the $ at the end, which would otherwise denote the end of the URL.
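Google-style robots.txt matching treats `*` as any run of characters and a trailing `$` as the end of the URL. One way to sanity-check patterns like the ones above before deploying them is to translate them into regular expressions. The sketch below is only an approximation of the matching rules, not Google's actual implementation, using the example URL from the question:

```python
import re

def robots_pattern_matches(pattern: str, url_path: str) -> bool:
    """Test a Google-style robots.txt Disallow pattern against a URL
    path + query string. '*' matches any run of characters and a
    trailing '$' anchors the pattern to the end of the URL."""
    # Escape regex metacharacters, then turn the escaped '*' back
    # into the regex wildcard '.*'.
    regex = re.escape(pattern).replace(r"\*", ".*")
    # A trailing '$' in the pattern means "end of URL".
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    # robots.txt patterns always match from the start of the path.
    return re.match(regex, url_path) is not None

url = "/brands?color=364&manufacturer=505"
print(robots_pattern_matches("/*?color=", url))         # True  - blocked
print(robots_pattern_matches("/*?color=$", url))        # False - '$' requires the URL to end at 'color='
print(robots_pattern_matches("/*manufacturer=", url))   # True  - '*' spans '?color=364&'
print(robots_pattern_matches("/?*manufacturer=", url))  # False - literal '?' must follow '/' directly
```

As the last check shows, putting the literal `?` before the `*` (as in `/?*manufacturer=`) only matches URLs whose query string begins right after the domain root, so `/*manufacturer=` is the safer form when the parameter may not come first.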
-
Hello Adrian,
The Moz reports are meant to help you uncover issues like this. If you're seeing non-canonical URLs in the Moz report then there is a potential issue for Google, Bing and other search engines as well.
Google does respect wildcards (*) in the robots.txt file, though they can easily be done wrong. There is no right or wrong answer to the issue of using filters or faceted navigation, as each circumstance is going to be different. However, I hope some of these articles will help you identify the best approach for your needs:
(Note: faceted navigation is not exactly the same as category filters, but the issues and possible solutions are very similar.)
- Building Faceted Navigation That Doesn't Suck
- Faceted Navigation Whiteboard Friday
- Duplicate Content: Block, Redirect or Canonical
- Guide to eCommerce Facets, Filters and Categories
- Rel Canonical How To and Why Not
- Moz.com Guide to Duplicate Content
I don't know how your store handles these filters (e.g. does it add the filter automatically, or only when a user selects one?), so I can't give you the answer, but I promise that if you read the articles above you will have a very good understanding of all of the options and can choose which is best for you. That might end up being as simple as blocking the filters in your robots.txt file, or you may opt for rel canonical, a noindex meta tag, AJAX, Google parameter handling, etc.
Good luck!
-
It's not Google's index that I'm interested in, in this case - it's the Moz reports. Moz was including over 10,000 'pages' because it was crawling these URLs. Now that I know how to edit the robots.txt, Moz will be prevented from including them again (we only have around 2,000 real pages, not 10,000).
-
I sought out the answer from a developer and got the following reply, so I'm posting it here in case it helps someone else:
To exclude pages with color or manufacturer in them you can use:
Disallow: /*?color=
Disallow: /?*manufacturer=
The $ sign in your try should be omitted, as it denotes the end of the URL.
-
Hi
I would recommend excluding these in Google Webmaster Tools. Once logged in to your account, under the "Crawl" menu you will find "URL Parameters". Find the relevant parameter in the list on this page and you can tell Google not to crawl these pages.
Hope this helps.
Steve