A web crawler, also known as a web spider or web robot, is a program that browses the World Wide Web in a methodical, automated manner. This process is called web crawling or spidering. Many sites, in particular search engines, use spidering as a means of providing up-to-date data. Web crawlers are mainly used to create a copy of all the visited pages for later processing by a search engine, which indexes the downloaded pages to provide fast searches.
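
To make the process concrete, below is a minimal sketch of the crawl loop just described, written against Python's standard library only. The seed URL, the page limit, and the helper names (`LinkExtractor`, `crawl`) are illustrative choices, not part of any particular search engine's implementation.

```python
# A minimal sketch of the crawl loop described above: fetch a page,
# store a copy for later indexing, extract its links, and enqueue them.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen


class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(seed, max_pages=10):
    """Breadth-first crawl starting from a seed URL."""
    frontier = deque([seed])  # URLs waiting to be visited
    visited = set()           # URLs already fetched, to avoid loops
    pages = {}                # URL -> raw HTML, the copy kept for indexing

    while frontier and len(pages) < max_pages:
        url = frontier.popleft()
        if url in visited:
            continue
        visited.add(url)
        try:
            with urlopen(url, timeout=10) as response:
                html = response.read().decode("utf-8", errors="replace")
        except Exception:
            continue  # skip unreachable or non-HTML pages
        pages[url] = html

        # Extract links and add absolute HTTP(S) URLs to the frontier.
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if urlparse(absolute).scheme in ("http", "https"):
                frontier.append(absolute)
    return pages


if __name__ == "__main__":
    results = crawl("https://example.com")
    print(f"Fetched {len(results)} page(s)")
```

A real crawler adds considerably more machinery on top of this loop, such as respecting robots.txt, rate-limiting requests per host, and prioritizing which URLs in the frontier to visit next.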