Robots.txt

jaysh4922

New member
Robots.txt is a special text file that always lives in your web server's root directory. It contains rules for web spiders, telling them which parts of the site they are allowed to crawl. In effect, it is a set of instructions for search engine spiders about what to follow and what not to. A correct robots.txt file is important if you want a good ranking on search engines.
 
Robots.txt is a single text file used to restrict access to parts of your site. If you don't want crawlers to visit certain pages, URLs, or folders, you can use the Disallow directive in robots.txt, and well-behaved crawlers will then skip those parts of the site.

So the purpose of robots.txt is to instruct crawlers which parts of the site should be crawled and which should not.
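As a minimal sketch, a robots.txt using the Disallow directive described above might look like this (the folder names are just examples, not required paths):

```
# Applies to all crawlers
User-agent: *
# Ask crawlers to skip these example folders
Disallow: /admin/
Disallow: /private/
```

Well-behaved crawlers treat each Disallow value as a URL-path prefix, so a URL like /admin/login.html would also be skipped.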
 
Robots.txt is a text file that is usually the first thing a search engine looks for when it crawls (indexes) your website. It can also indicate the location of your XML sitemap. The search engine then sends its "bot" (also called a "robot" or "spider") to crawl your site as directed in the robots.txt file.
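For example, the sitemap location mentioned above can be given with a Sitemap line (the URL here is only a placeholder):

```
User-agent: *
Disallow:

# Tell crawlers where to find the XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

An empty Disallow value means nothing is blocked for that user agent; only the sitemap hint is provided.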
 
It is a text file placed on the server that contains a list of robots and Disallow rules for them. Each Disallow rule prevents any URL that begins with the disallowed string from being accessed. If you cannot place files in your server's root directory, you will not be able to use a robots.txt file to exclude pages from a search engine's index.
 