Robots.txt Question
-
For our company website, faithology.com, we are attempting to block any URLs that contain a question mark ("?") to keep Google from seeing some pages as duplicates.
Our robots.txt is as follows:
User-agent: *
Disallow: /*?

User-agent: rogerbot
Disallow: /community/

Is the above correct? We want them not to crawl any URL with a "?" in it, but we don't want to harm our own SEO. Thanks for your help!
-
You can use wildcards, in theory, but I haven't tested "?" specifically, and that could be a little risky. I'd just make sure the rule doesn't over-match.
Honestly, though, robots.txt isn't as reliable as I'd like. It can be good for preventing content from being crawled in the first place, but once content has already been crawled and indexed, it's not great for removing it from the index. You might be better off with a META NOINDEX tag or the rel=canonical tag.
It depends a lot on which parameters you're trying to control, what value these pages have, whether they have links, etc. A wholesale block of everything with "?" seems really risky to me.
If you want to give a few example URLs, maybe we could give you more specific advice.
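For reference, the meta noindex and rel=canonical alternatives mentioned above look roughly like this; the domain and paths here are placeholders, not the poster's actual URLs:

```html
<!-- On the parameterized page, to keep it out of the index entirely: -->
<meta name="robots" content="noindex, follow">

<!-- Or, to consolidate ranking signals to the clean URL instead: -->
<link rel="canonical" href="http://www.example.com/some-page/">
```

Unlike a robots.txt block, both of these require the page to be crawled so the tag can be seen, which is why they work even for pages already in the index.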
-
If I were you, I would want to be 100% sure I got it right. This tool has never let me down, and the way you have the rogerbot directive written, it may be blocked from more than you intend.
Why not use a free tool from a very reputable company to make your robots.txt perfect?
http://www.internetmarketingninjas.com/seo-tools/robots-txt-generator/
http://www.searchenginepromotionhelp.com/m/robots-text-tester/
Then, lastly, to make sure everything is perfect, I recommend one of my favorite tools. It's free for up to 500 pages, as many times as you want; the paid version costs, I believe, $70 a year:
http://www.screamingfrog.co.uk/seo-spider/
It's one of the best tools on the planet.
While you're at the Internet Marketing Ninjas website, look around; they have loads of other excellent tools that are recommended here.
Sincerely,
Thomas
-
Yes, you can.
Robots.txt Wildcard Matching
Google and Microsoft's Bing allow the use of wildcards in robots.txt files.
To block access to all URLs that include a question mark (?), you could use the following entry:
User-agent: *
Disallow: /*?
You can use the $ character to specify matching the end of the URL. For instance, to block any URLs that end with .asp, you could use the following entry:
User-agent: Googlebot
Disallow: /*.asp$
More background on wildcards is available from Google and Yahoo! Search.
More:
http://tools.seobook.com/robots-txt/
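To see how those two wildcard rules behave, here is a small, simplified Python sketch of Google-style matching. The function name and sample paths are illustrative only, not part of any library, and real crawlers apply extra rules (such as longest-match precedence between Allow and Disallow) that this sketch ignores:

```python
import re

def rule_matches(rule: str, url_path: str) -> bool:
    """Check whether a robots.txt Disallow rule matches a URL path,
    using Google-style wildcards: '*' matches any run of characters,
    and a trailing '$' anchors the match to the end of the URL."""
    anchored = rule.endswith("$")
    if anchored:
        rule = rule[:-1]
    # Escape regex metacharacters, then turn the escaped '*' back into '.*'.
    # Rules are prefix matches, so the pattern is anchored at the start.
    pattern = "^" + re.escape(rule).replace(r"\*", ".*")
    if anchored:
        pattern += "$"
    return re.search(pattern, url_path) is not None

# 'Disallow: /*?' blocks any URL containing a question mark
print(rule_matches("/*?", "/products?color=red"))   # True
print(rule_matches("/*?", "/products/red-shoes"))   # False

# 'Disallow: /*.asp$' blocks only URLs that end in .asp
print(rule_matches("/*.asp$", "/page.asp"))         # True
print(rule_matches("/*.asp$", "/page.aspx"))        # False
```

Note that Python's built-in urllib.robotparser does plain prefix matching and does not honor these wildcards, which is one more reason to test a wildcard robots.txt in Google's own tools before relying on it.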
Hope I was of help,
Tom