If you have Directory Listing set to ON, then anyone who can work out a folder path from your URLs or page source can start poking around your folders and see every file inside them.
If you have Directory Listing set to OFF, then nobody can browse a folder's contents; the only thing exposed is whatever path appears in your URLs, not the files or sub-folders inside it.
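If you are on Apache, for example, switching the listing off is a one-line .htaccess entry (a minimal sketch; the exact directive and where it is allowed depend on your server and host configuration):

    # .htaccess in the site root (Apache) - disable automatic directory listings
    Options -Indexes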
You can also change the permissions on individual folders and files so that they are not visible to outside visitors but are still readable by the server when it needs them to serve up pages. Done that way, the files stay hidden even with directory listing turned on, because permissions set on a folder take priority over a general directive to list directory contents.
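As a rough sketch of that idea on Apache 2.4 (older versions and other servers use different directives, and the file extensions here are just placeholders), you can drop a .htaccess file into the folder that refuses direct HTTP requests for the raw files while scripts on the server can still read them from disk:

    # .htaccess inside the protected folder (Apache 2.4)
    # Block direct HTTP requests for these file types; server-side reads still work
    <FilesMatch "\.(inc|sql|log)$">
        Require all denied
    </FilesMatch>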
robots.txt won't really help. It is a voluntary code of practice: if a web bot wants to crawl your site, it can simply ignore anything you have asked it to skip. The only real fireproof method is to set directory and file permissions on everything you want secured, turn off directory listing, and expose as little to the outside world as possible.
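For what it's worth, a robots.txt entry is nothing more than a polite request, and it actually advertises the path you want hidden (the /private/ folder below is just a made-up example):

    # robots.txt - well-behaved crawlers honour this, bad ones read it for ideas
    User-agent: *
    Disallow: /private/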
Use .htaccess rewrite rules to catch direct requests for files and directories and redirect them however you see fit.
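As one possible sketch, assuming Apache with mod_rewrite enabled and a hypothetical /includes/ folder you never want requested directly, you could bounce any such request back to the home page:

    # .htaccess in the site root - redirect direct requests for the includes folder
    RewriteEngine On
    RewriteCond %{REQUEST_URI} ^/includes/ [NC]
    RewriteRule ^ / [R=302,L]

Swap the redirect for [F,L] instead if you would rather return a plain 403 Forbidden than send the visitor anywhere.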