Robots.txt file
-
Does it serve any purpose if we omit the robots.txt file? I wonder, if the spider has to read all the pages anyway, why do we insert a robots.txt file?
-
As Ryan said, the robots.txt file is very useful when you want to block (disallow) some pages. Indeed, if you don't want the spider to crawl your page, you must use robots.txt (a noindex tag lets the bot crawl, but not index, your page). I have only a small website, but I still dropped a robots.txt in my folder. Writing just Allow: / may be useless in practice, but it's a way of saying: "I respect the protocols."
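For reference, a minimal "allow everything" robots.txt like the one described above could look like this (Allow is a widely supported extension; an empty Disallow line is the classic equivalent):

```
User-agent: *
Allow: /
```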
-
A good source to learn about the robots.txt file is here: http://www.robotstxt.org/
The robots.txt file is completely optional. I don't use the file at all on small sites.
The file offers a means to block crawlers which choose to honor the file's instructions from crawling all or part of a site. It also provides the location of a sitemap.
That said, sitemaps are completely unnecessary for SEO, assuming your site has proper navigation. Even if you choose to use a sitemap, you can submit its location via Google Webmaster Tools (WMT) rather than the robots.txt file.
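If you do decide to reference a sitemap from robots.txt rather than WMT, the directive is a single line with an absolute URL (domain hypothetical):

```
Sitemap: https://www.example.com/sitemap.xml
```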
With respect to blocking areas of your site, the primary use would be for CMS, forum, e-commerce, or other sites where the software is limited and does not allow the site owner to use noindex on all pages.
As a rule, robots.txt should simply never be used except as a means of last resort. In my experience the file is overused by site owners and SEOs. One exception where I use a robots.txt is during a site's development when I do not wish the site to be crawled at all.
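A "block everything during development" file is just two lines, and you can sanity-check its effect with Python's standard-library parser (the URL below is a hypothetical example):

```python
from urllib import robotparser

# The "do not crawl anything" robots.txt used during development
dev_rules = [
    "User-agent: *",
    "Disallow: /",
]

rp = robotparser.RobotFileParser()
rp.parse(dev_rules)

# Any compliant crawler should be refused any URL on the site
print(rp.can_fetch("Googlebot", "https://example.com/any-page"))  # False
```

Remember to remove (or empty) the file before launch, or compliant crawlers will never see the live site.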
Related Questions
-
Meta Robots information
Hi, I have a question about the Meta Robots information. According to the Moz bar, our page uses the meta robots noodp and noydir. Our competitor uses INDEX,FOLLOW. I read that noodp and noydir are dated and not used anymore. Is it wise to use INDEX,FOLLOW instead for better SEO? Thanks in advance!
On-Page Optimization | AdoBike
Will Google Count Links Loaded from JavaScript Files After the Page Loads
Hi, I have a simple question. I want to put an image with a link to another site, like a banner ad, on my page, but I do not want it counted by Google. Can I simply load the link and banner using jQuery onload from a separate .js file? The ideal result would be for Google to index a script tag instead of a link.
On-Page Optimization | CopBlaster.com
Internal Linking : File Name or URI/filename.html
Hi, for crawlability, what would be the best way to do internal linking: /aboutus.html or www.xyz.com/aboutus.html? Would this make any difference to the website in terms of SEO or bots crawling the website? Regards, Sree
On-Page Optimization | jungleegames
"translation" of code in htaccess file
Hi everyone! I am a newbie to the whole SEO and HTML thing and I am trying to get a better understanding of the "behind the scenes" part of my website. I hope I can find someone here who can translate a piece of code for me that I have in my .htaccess file:

Options -MultiViews
Options +FollowSymLinks
# Do not rewrite requests for these known files and directories
RewriteCond $1 !^(index.php|public|tmp|robots.txt|template.html|favicon.ico|images|css|uploads)
# ...or for any path that exists as a real file or directory
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
# Everything else is routed to index.php as ?link=<original path>
RewriteRule ^(.*)$ index.php?link=$1 [NC,L,QSA]

I know that something is getting redirected to the index file, but what (or when) exactly? Does the word "robots" mean that search engine crawlers are getting redirected here? And is this good or bad (in terms of SEO)? Or is this redirecting people who try to get to my robots, template, or image files? Thanks in advance for any answers!
On-Page Optimization | momof4
Disavow Tool Submitting 2nd File
Hi, about two months ago I submitted a disavow file using the Disavow Tool. I have collected more links and I am ready to upload a second file. Should I download the previous file in Webmaster Tools (Disavow Tool) and add the new links to that file, or will it be OK if I just upload and override the existing file with a file containing the new links only? What I don't want to do is something that removes all the previous findings from the list so that Google cannot see them anymore. I guess what I am trying to say is: does Google just refer to the live file I am updating/overriding, or once I have submitted a file, whether I remove it or not, will Google still have a record of it and be referring to it? Thanks, Adam
On-Page Optimization | AMG100
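For what it's worth, the disavow file itself is plain text with one entry per line; a hypothetical example:

```
# Comment lines start with a hash
http://spam.example.com/bad-page.html
domain:link-farm.example.net
```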
How do i block an entire category/directory with robots.txt?
Does anyone have any idea how to block an entire product category, including all the products in that category, using the robots.txt file? I'm using WooCommerce in WordPress and I'd like to prevent bots from crawling every single one of my product URLs for now. The confusing part right now is that I have several different URL structures linking to every single one of my products, for example www.mystore.com/all-products, www.mystore.com/product-category, etc. I'm not really sure how I'd type it into the robots.txt file, or where to place the file. Any help would be appreciated. Thanks
On-Page Optimization | bricerhodes
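A sketch of what the asker describes, assuming the category pages live under the paths mentioned (paths hypothetical). The file must be placed at the site root, e.g. www.mystore.com/robots.txt:

```
User-agent: *
Disallow: /product-category/
Disallow: /all-products/
```

Disallow matches by URL-path prefix, so every product URL beginning with one of those paths is covered; URLs reachable under other structures need their own Disallow lines.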
Photogallery and Robots.txt
Hey everyone, SEOmoz is telling us that there are too many on-page links on the following page: http://www.surfcampinportugal.com/photos-of-the-camp/ Should we stop it from being indexed via robots.txt? Best regards and thanks in advance... Simon
On-Page Optimization | Rapturecamps
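If the goal is to keep a single page out of the index while still letting crawlers follow its links, the answer above suggests a meta robots tag rather than robots.txt; placed in the page's head section it looks like:

```
<meta name="robots" content="noindex, follow">
```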
How do you block development servers with robots.txt?
When we create client websites, the URLs are client.oursite.com. Google is indexing these sites and attaching them to our domain. How can we stop it with robots.txt? I've heard you need to have the robots file on both the main site and the dev sites... A code sample would be groovy. Thanks, TR
On-Page Optimization | DisMedia
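Each subdomain is treated as its own site for robots.txt purposes (host names here are hypothetical), so a blocking file would need to live at the root of each dev subdomain, e.g. at client.oursite.com/robots.txt:

```
User-agent: *
Disallow: /
```

The main site's own robots.txt at www.oursite.com/robots.txt stays unchanged, since it has no effect on other subdomains.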