Hello,
A web crawler (also known by other names such as ant, automatic indexer, bot, web spider, or web robot) is an automated program, or script, that methodically scans or "crawls" through web pages to build an index of the data it is set to look for. This process is called web crawling or spidering.
Search engines use crawlers to locate information on the WWW: a crawler indexes all the words in a document, adds them to a database, then follows every hyperlink in that document and indexes and adds the linked pages to the database as well.
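To make the process concrete, here is a minimal sketch of that crawl-and-index loop in Python. It is illustrative only: the page contents, URLs, and the `fetch` callback are hypothetical stand-ins (a real crawler would fetch over HTTP and also honor robots.txt, rate limits, etc.).

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkAndTextParser(HTMLParser):
    """Collects hyperlink targets and visible words from one HTML page."""
    def __init__(self):
        super().__init__()
        self.links = []
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.words.extend(data.split())

def crawl(start_url, fetch, max_pages=10):
    """Breadth-first crawl: index each page's words, then follow its links.

    `fetch(url)` returns the page's HTML, or None if unavailable.
    Returns an inverted index: word -> set of URLs containing it.
    """
    index = {}
    visited = set()
    queue = deque([start_url])
    while queue and len(visited) < max_pages:
        url = queue.popleft()
        if url in visited:
            continue
        visited.add(url)
        html = fetch(url)
        if html is None:
            continue
        parser = LinkAndTextParser()
        parser.feed(html)
        for word in parser.words:
            index.setdefault(word.lower(), set()).add(url)
        for link in parser.links:
            queue.append(urljoin(url, link))  # resolve relative links
    return index

# Demo with a fake in-memory "web" (hypothetical URLs and content):
pages = {
    "http://example.com/": '<a href="/about">About</a> home page',
    "http://example.com/about": "about us",
}
index = crawl("http://example.com/", pages.get)
```

Using a `fetch` callback instead of hard-coding a network call keeps the crawl logic testable; swapping in a real HTTP fetcher (e.g. one built on `urllib.request`) turns the same loop into a working crawler.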