First, a small correction: it's not Robert.txt, it's robots.txt. It tells crawlers which pages of your website should be crawled and which should be avoided. You can even set a crawl delay.
Robots.txt is a text file containing instructions for search engine robots. It lists which webpages are allowed and which are disallowed for search engine crawling.
Robots.txt is used to restrict crawlers from indexing certain pages of a site. Some pages on a website are not necessary to show on search engine result pages, so we restrict search engine crawlers from indexing them. Examples of such pages are the terms and conditions page and the privacy policy page.
The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells web robots (most often search engine crawlers) which pages on your site to crawl and which not to crawl. When a search engine is about to visit a site, it checks this file first for crawling instructions.
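To make the answers above concrete, here is a minimal robots.txt sketch; the `/privacy-policy/` and `/terms-and-conditions/` paths are hypothetical examples, and `Crawl-delay` is a non-standard directive that only some crawlers honor:

```
# Applies to all crawlers
User-agent: *

# Keep these pages out of the crawl (hypothetical paths)
Disallow: /privacy-policy/
Disallow: /terms-and-conditions/

# Ask supporting crawlers to wait 10 seconds between requests
Crawl-delay: 10
```

The file must be placed at the root of the site (e.g. `example.com/robots.txt`) for crawlers to find it.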