The robots.txt file is designed to tell search engines whether or not they can crawl your site, which parts of it are off-limits, and (for crawlers that support it) how long to wait between requests to reduce server load. Search engines love to go as fast as possible, and that's ideal for webmasters too, but if your host can't keep up, a group of engines all crawling at the same time could knock the site offline. So consider this when deciding whether or not you need a crawl delay in place.
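
As a rough sketch, a minimal robots.txt covering those three points might look like the following. The paths and the 10-second delay are placeholder values, and note that Crawl-delay is only honored by some crawlers (Bing and Yandex, for example); Google ignores it and expects crawl rate to be managed through its own tools instead.

```
# Rules for all crawlers
User-agent: *

# Block sections you don't want crawled (example paths)
Disallow: /admin/
Disallow: /tmp/

# Ask supporting crawlers to wait 10 seconds between requests
Crawl-delay: 10
```

The file lives at the root of your domain (e.g. https://example.com/robots.txt), and well-behaved crawlers fetch it before crawling anything else.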