We should use robots.txt if we want to stop crawlers from crawling private pages. You can create it with any plain-text editor (such as Notepad) and upload it to the root of your server. robots.txt tells crawlers such as Google's, Yahoo's, and Bing's which pages on your site they should not crawl.
Here is an example of how we use it. We have a testing directory on our server where we upload sites for testing before they go public. Obviously, we do not want crawlers to index this directory, since it exists only for testing. The robots.txt file contents are below.
User-agent: *
Disallow: /testing/
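As a quick sanity check, Python's standard urllib.robotparser module can evaluate rules like these. The sketch below is illustrative only: the example.com URLs and the crawler name are placeholders, not part of the setup described above.

```python
# Verify that the robots.txt rules above block /testing/ but
# allow everything else, using Python's stdlib robots.txt parser.
from urllib import robotparser

rules = [
    "User-agent: *",
    "Disallow: /testing/",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The testing directory is off-limits to any crawler...
print(rp.can_fetch("Googlebot", "http://example.com/testing/index.html"))  # False
# ...while public pages remain crawlable.
print(rp.can_fetch("Googlebot", "http://example.com/about.html"))  # True
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not access control, so anything truly private should also be protected on the server side.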