What Is a Googlebot?
Googlebot is a software program that constantly discovers and indexes new content on the Internet to keep Google's index up to date and to improve the relevance of its search results. Often referred to as a "spider," Googlebot is the program that crawls through sites so they can be indexed.
Google uses a massive collection of computers to crawl through billions of web pages. To coordinate such a monumental undertaking, Googlebot relies on an algorithmic process: programs determine which sites should get crawled, how often they should get crawled, and how many pages should get crawled from each site.
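Google does not publish its scheduling logic, but the idea can be illustrated with a rough sketch. The field names, thresholds, and formulas below are invented for the example and only show the kind of trade-off such a scheduler makes (revisit frequently changing sites sooner, fetch fewer pages per visit from slow servers):

```python
from dataclasses import dataclass

@dataclass
class SiteStats:
    # Hypothetical per-site signals a scheduler might track
    url: str
    change_frequency: float   # observed changes per day on past crawls
    page_count: int           # pages discovered so far
    avg_fetch_time: float     # seconds per page, a rough server-load signal

def crawl_plan(site: SiteStats) -> dict:
    """Illustrative only: decide how often and how deeply to crawl a site."""
    # Sites that change often get revisited sooner (bounded to a sane range).
    revisit_days = max(1, min(30, round(7 / (site.change_frequency + 0.1))))
    # Slower servers get a smaller per-visit page budget to limit load on them.
    pages_per_visit = min(site.page_count, 500 if site.avg_fetch_time < 1.0 else 100)
    return {"site": site.url, "revisit_days": revisit_days, "pages_per_visit": pages_per_visit}

print(crawl_plan(SiteStats("https://example.com", change_frequency=2.0, page_count=1200, avg_fetch_time=0.4)))
```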
The crawling process begins with a list of webpage URLs generated from Googlebot's previous crawls and augmented with webmaster-provided sitemap data. As Googlebot visits each page on its list, it detects links and adds them to the list of pages to be crawled. Googlebot notes newly created websites, newly added webpages, dead links, and any other changes it finds, then makes the appropriate updates to Google's index. A minimal sketch of that discover-and-queue cycle appears below.
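The following is a bare-bones illustration of that cycle, not Google's implementation: the seed URLs, the page limit, and the `index` dictionary are placeholders introduced for the example. It fetches each queued page, extracts its links, and queues any links it has not seen before:

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href values from anchor tags on a fetched page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed_urls, max_pages=10):
    """Illustrative frontier crawl: fetch queued pages, discover links, queue the new ones."""
    frontier = deque(seed_urls)        # seeds, e.g. from earlier crawls or sitemap data
    seen = set(seed_urls)
    index = {}                         # stand-in for the search index being built
    while frontier and len(index) < max_pages:
        url = frontier.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="replace")
        except OSError:
            index[url] = None          # record the dead or unreachable link
            continue
        index[url] = len(html)         # a real system stores parsed content, not a byte count
        parser = LinkExtractor()
        parser.feed(html)
        for link in parser.links:
            absolute = urljoin(url, link)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                frontier.append(absolute)
    return index

if __name__ == "__main__":
    print(crawl(["https://example.com/"], max_pages=3))
```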
Essentially, Googlebot is what builds Google's index, and it is what search engine optimizers hope to attract.