In my opinion, a sitemap is more important than robots.txt, as it helps a search engine bot crawl a website effectively. Robots.txt is generally used to request (via Allow: and Disallow: directives) that a crawler not crawl and index certain sections of your website, for example those containing sensitive data. It is entirely up to the crawler to honour that request by not crawling and indexing those parts. Still, it is common practice among webmasters worldwide to have a robots.txt file for each of their sites. A typical robots.txt granting access to the entire website looks like this:
User-agent: *
Disallow:
Sitemap: http://www.yoursite.com/sitemap.xml
So if you want certain sections (folders, directories) of your site not to be crawled by a bot, you can say so in robots.txt, as in the sketch below.
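For example, a robots.txt that blocks a couple of directories might look like this (the /admin/ and /private/ paths here are only placeholders, not paths from your site):

User-agent: *
Disallow: /admin/
Disallow: /private/
Sitemap: http://www.yoursite.com/sitemap.xml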
Yes, logically it is the same as not having a robots.txt file at all: crawlers are allowed everything by default, so a file that grants full access only makes that default explicit. Also, having a robots.txt file does not guarantee a rank boost in the SERPs. Hope it helps. For more understanding, please refer to these resources:
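As a quick sanity check, here is a small sketch using Python's standard-library urllib.robotparser (the URLs and wildcard user agent are just placeholders) showing that an empty Disallow: rule permits every path, exactly as if no robots.txt existed:

from urllib.robotparser import RobotFileParser

# robots.txt that grants full access (empty Disallow rule)
permissive_rules = [
    "User-agent: *",
    "Disallow:",
]

parser = RobotFileParser()
parser.parse(permissive_rules)

# Every path is allowed, which is also the default when no robots.txt exists.
print(parser.can_fetch("*", "http://www.yoursite.com/"))          # True
print(parser.can_fetch("*", "http://www.yoursite.com/private/"))  # True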
Cheers