Robots.txt Question
-
For our company website faithology.com, we are attempting to block any URLs that contain a question mark (?) to keep Google from seeing some pages as duplicates.
Our robots.txt is as follows:
User-agent: *
Disallow: /*?

User-agent: rogerbot
Disallow: /community/

Is the above correct? We want them not to crawl any URL with a "?" in it; however, we don't want to harm our own SEO. Thanks for your help!
-
You can use wildcards, in theory, but I haven't tested "?" and that could be a little risky. I'd just make sure it doesn't over-match.
Honestly, though, robots.txt isn't as reliable as I'd like. It can be good for keeping content from being crawled, but once that content is already in the index, it's not great at getting it removed. You might be better off with a META NOINDEX tag or rel=canonical.
It depends a lot on what parameters you're trying to control, what value these pages have, whether they have links, etc. A wholesale block of everything with "?" seems really risky to me.
If you want to give a few example URLs, maybe we could give you more specific advice.
-
If I were you, I would want to be 100% sure I got it right. These tools have never let me down, and the way you have rogerbot set up, it may be blocked.
Why not use a free tool from a very reputable company to make your robots.txt perfect?
http://www.internetmarketingninjas.com/seo-tools/robots-txt-generator/
http://www.searchenginepromotionhelp.com/m/robots-text-tester/
Then, lastly, to make sure everything is perfect, I recommend one of my favorite tools. It's free for up to 500 pages, as many times as you want; the full version costs, I believe, $70 a year:
http://www.screamingfrog.co.uk/seo-spider/
It's one of the best tools on the planet.
While you're at the Internet Marketing Ninjas website, look around for other tools; they have loads of excellent tools that are recommended here.
Sincerely,
Thomas
-
Yes, you can.
Robots.txt Wildcard Matching
Google and Microsoft's Bing allow the use of wildcards in robots.txt files.
To block access to all URLs that include a question mark (?), you could use the following entry:
User-agent: *
Disallow: /*?

You can use the $ character to specify matching the end of the URL. For instance, to block any URLs that end with .asp, you could use the following entry:
User-agent: Googlebot
Disallow: /*.asp$

More background on wildcards is available from Google and Yahoo! Search.
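If you want to sanity-check how this style of wildcard matching behaves before deploying it, here's a small sketch (the function names are my own, not from any library) that translates a Disallow pattern with * and $ into a regex, roughly following the matching rules Google and Bing document:

```python
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    """Translate a robots.txt path pattern (with * and $ wildcards)
    into a compiled regex. Simplified sketch, not a full parser."""
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"           # * matches any run of characters
        elif ch == "$":
            regex += "$"            # $ anchors the match to the end of the URL
        else:
            regex += re.escape(ch)  # everything else matches literally
    return re.compile(regex)

def is_blocked(path: str, disallow_pattern: str) -> bool:
    """A Disallow rule matches when the pattern matches a prefix of the path."""
    return robots_pattern_to_regex(disallow_pattern).match(path) is not None

# The rules discussed above:
print(is_blocked("/page?sort=price", "/*?"))   # True  -- URL contains a "?"
print(is_blocked("/page", "/*?"))              # False -- no "?" present
print(is_blocked("/about.asp", "/*.asp$"))     # True  -- ends with .asp
print(is_blocked("/about.aspx", "/*.asp$"))    # False -- $ anchors the match
```

Running your real URL list through something like this is a quick way to spot over-matching before the rule goes live.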
More: http://tools.seobook.com/robots-txt/
Hope I was of help,
Tom