Robots.txt is a plain text file placed at the root of a website that contains instructions for search engine crawlers. It lists which pages or directories crawlers are allowed or disallowed to access.
It is usually desirable for search engines to visit your site frequently and index your content, but there are often parts of your site that you do not want indexed. The robots.txt file is how you tell bots how to crawl the site and which areas to skip.
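As a rough illustration, a minimal robots.txt might look like the sketch below. The /admin/ and /tmp/ paths and the sitemap URL are hypothetical placeholders, not values from this article:

```
# Rules for all crawlers (the * wildcard matches every user agent)
User-agent: *

# Hypothetical private areas we don't want crawled
Disallow: /admin/
Disallow: /tmp/

# Everything else remains crawlable
Allow: /

# Optional: point crawlers at the sitemap (hypothetical URL)
Sitemap: https://www.example.com/sitemap.xml
```

The file must be served at the site root (e.g. https://www.example.com/robots.txt) to be found by crawlers; note that these directives are advisory, and well-behaved bots honor them while malicious ones may not.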