use of robots.txt file
What is the use of robots.txt file in context with SEO?
Robots.txt is used for giving instructions to search engines. If you don't want certain pages to be crawled, you can tell crawlers to skip them.
The robots.txt file is just a simple text file, as the file extension suggests. It can be created with a plain text editor like Notepad or WordPad; complicated word processors such as Microsoft Word may corrupt the file.
Robots.txt is a text file that instructs web crawlers not to crawl, index, or visit the pages listed in it. It is generally kept in the root folder of the website.
Now I know about robots.txt. Thanks to all. It is very useful information.
A robots.txt file is used to restrict access to your site by search engine robots that crawl the web.
The simplest robots.txt file uses two rules:
User-agent: the robot the following rule applies to
Disallow: the URL you want to block
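For example, a minimal robots.txt combining those two rules might look like the sketch below (the /private/ and /tmp/ paths are made-up examples, not anything from this thread):

```
# Block every crawler from two example directories
User-agent: *
Disallow: /private/
Disallow: /tmp/

# Additionally block one specific bot from the whole site
User-agent: BadBot
Disallow: /
```

Rules are grouped by User-agent, and an empty Disallow value (or no matching rule) means the URL may be crawled.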
Basically, it is used to tell the crawler which pages or files should not be crawled.
Not exactly true.
It is purely optional on the part of the crawler to observe the robots.txt file.
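Right, robots.txt is advisory only. A well-behaved crawler checks it before fetching a URL; a rude one simply skips that check. Here is a short sketch of what a polite crawler does, using Python's standard-library robotparser (the example.com URLs and the /private/ rule are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# A polite crawler would download https://example.com/robots.txt;
# here we feed the (hypothetical) file contents in directly.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Check each URL against the rules before fetching it.
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False: disallowed
print(rp.can_fetch("*", "https://example.com/index.html"))         # True: allowed
```

Nothing in HTTP enforces this check; a crawler that never calls `can_fetch` can still request the disallowed pages, which is exactly the point being made above.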