A robots.txt file restricts access to your site by search engine robots that crawl the web.
The simplest robots.txt file uses two rules:
User-agent: the robot to which the following rules apply
Disallow: the URL you want to block
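Putting the two rules together, a minimal robots.txt might look like this (the `/private/` path is a hypothetical example, not a required name):

```
User-agent: *
Disallow: /private/
```

Here `*` matches every crawler, so all robots that honor robots.txt will skip any URL whose path begins with `/private/`.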
Together, these rules tell a crawler which pages or files it should not crawl.
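To see how a well-behaved crawler interprets these rules, Python's standard-library `urllib.robotparser` can parse them and answer "may I fetch this URL?" (the domain and paths below are hypothetical examples):

```python
from urllib.robotparser import RobotFileParser

# Parse example robots.txt rules supplied as a list of lines
parser = RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A polite crawler checks each URL against the rules before fetching it
print(parser.can_fetch("*", "https://example.com/private/page.html"))  # False: blocked
print(parser.can_fetch("*", "https://example.com/public/page.html"))   # True: allowed
```

In practice a crawler downloads `https://yoursite.com/robots.txt` first and applies these checks to every URL it discovers; note that robots.txt is advisory, so it does not protect private content from crawlers that ignore it.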