Robots.txt

Robots.txt is a text file placed in the root directory of a website. It tells search engine spiders which files they may or may not crawl.
 
Robots.txt is a text file containing instructions for search engine robots. The file lists which webpages are allowed and which are disallowed from search engine crawling.
 
Robots.txt is used to restrict crawlers from crawling certain pages of a site. Some pages on a website do not need to appear on the search engine results page, so we block search engine crawlers from those pages. Examples of such pages are the terms and conditions page and the privacy policy page.
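For illustration, here is a minimal sketch of such a robots.txt file. It blocks all crawlers from a terms and conditions page and a privacy policy page while leaving the rest of the site crawlable; the paths and sitemap URL are assumptions, not taken from the answers above.

    # Applies to all crawlers
    User-agent: *
    # Hypothetical paths we do not want crawled
    Disallow: /terms-and-conditions/
    Disallow: /privacy-policy/

    # Optional: point crawlers to the sitemap (assumed URL)
    Sitemap: https://www.example.com/sitemap.xml

Note that blocking a page in robots.txt prevents crawling but does not guarantee it will never appear in search results; removing it from the index may require other measures such as a noindex directive.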
 