The robots.txt file is a useful convention for asking cooperating web robots and crawlers not to access all or part of a website — content we don't want crawled and indexed, even though it remains publicly viewable. Search engines consult it when crawling and categorizing a site, and it lets us declare which areas of the site crawlers should stay out of. Note that it is a voluntary request, not access control: misbehaving bots can ignore it, and it is different from a `nofollow` link attribute or a `noindex` meta tag. Please share your experience.
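As a concrete illustration (the paths here are made up, just a sketch of the common directives), a simple robots.txt placed at the root of a site might look like this:

```
# Applies to all cooperating crawlers
User-agent: *
# Ask bots to skip these (hypothetical) areas of the site
Disallow: /admin/
Disallow: /private/
# Everything else remains crawlable
Allow: /

# Optional: point crawlers at the sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The file must be served at the site root (e.g. `https://www.example.com/robots.txt`) for crawlers to find it.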