Web Crawler

A web crawler is a computer program that systematically browses the internet. Web crawlers are also known as ants, automatic indexers, bots, web spiders, and web robots. Search engines send out web crawlers in a process called web crawling or spidering, and many sites, search engines in particular, use spidering to keep their data up to date. Web crawlers are mainly used to create a copy of every visited page for later processing by a search engine, which indexes the downloaded pages to provide fast searches. Crawlers can also be used to gather specific types of information from web pages, check links, and validate HTML code.
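
The core of a crawler is a fetch-parse-enqueue loop: fetch a page, store a copy of it, extract its links, and queue any unvisited links for fetching. Below is a minimal sketch of that loop in Python, using only the standard library; the seed URL and the page cap are illustrative assumptions, not details from the text above.

    # Minimal breadth-first crawler sketch (illustrative; seed URL and
    # max_pages are assumed values, not from the original text).
    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    class LinkExtractor(HTMLParser):
        """Collects the href values of all <a> tags on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(seed, max_pages=20):
        """Fetch pages breadth-first, keeping a copy of each page body
        for later processing (e.g. indexing by a search engine)."""
        seen = {seed}
        queue = deque([seed])
        pages = {}  # url -> raw HTML: the "copy" a search engine would index
        while queue and len(pages) < max_pages:
            url = queue.popleft()
            try:
                with urlopen(url, timeout=10) as response:
                    html = response.read().decode("utf-8", errors="replace")
            except Exception:
                continue  # unreachable or non-HTML resource: skip it
            pages[url] = html
            parser = LinkExtractor()
            parser.feed(html)
            for href in parser.links:
                absolute = urljoin(url, href)
                if urlparse(absolute).scheme in ("http", "https") and absolute not in seen:
                    seen.add(absolute)
                    queue.append(absolute)
        return pages

    if __name__ == "__main__":
        fetched = crawl("https://example.com")
        print(f"Fetched {len(fetched)} pages")

A production crawler would additionally honor each site's robots.txt rules and rate-limit its requests so that it does not overload the servers it visits.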