According to various estimates, there are several trillion individual pages on the internet. Search engines like Google, Yahoo and Bing help us retrieve relevant content from this gigantic glut of information in a timely manner. To do this, they maintain a page index containing the information their algorithms need to generate search results.
Search engines use automated programmes called spiders or bots (short for robots; Google's, for example, is called Googlebot) that crawl the web from link to link and retrieve data about the pages they visit. The search engines store this information in a massive database called the index. The Google Search index contains “hundreds of billions” of webpages and is well over 100,000,000 gigabytes in size.
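To make the crawl-and-index cycle concrete, here is a minimal sketch in Python using only the standard library. The seed URL, page limit and word-level tokenization are simplifying assumptions for illustration, not how any production crawler is actually configured:

```python
# A toy breadth-first crawler: follow links from a seed page and build an
# inverted index mapping each word to the pages that contain it.
# Uses only the Python standard library; the seed URL and page limit are
# illustrative assumptions, not real crawler configuration.
import re
from collections import defaultdict, deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen


class LinkAndTextParser(HTMLParser):
    """Collects href links and visible text from a single HTML page."""

    def __init__(self):
        super().__init__()
        self.links = []
        self.text_parts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

    def handle_data(self, data):
        self.text_parts.append(data)


def crawl(seed_url, max_pages=10):
    """Crawl from seed_url, returning {word: set of URLs containing it}."""
    index = defaultdict(set)
    frontier = deque([seed_url])
    visited = set()
    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        if url in visited:
            continue
        visited.add(url)
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", errors="ignore")
        except Exception:
            continue  # unreachable or non-HTML page: skip it
        parser = LinkAndTextParser()
        parser.feed(html)
        # Index this page: record which words appear on it.
        for word in re.findall(r"[a-z]+", " ".join(parser.text_parts).lower()):
            index[word].add(url)
        # Follow links: resolve relative URLs and queue them for crawling.
        frontier.extend(urljoin(url, link) for link in parser.links)
    return index


if __name__ == "__main__":
    index = crawl("https://example.com", max_pages=5)
    print(f"indexed {len(index)} distinct words across the crawl")
```

A real crawler adds politeness (robots.txt, rate limits), deduplication and far richer parsing, but the loop is the same: fetch a page, index its words, follow its links.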
When a user seeking information types a query into the engine’s search box, algorithms interpret what the user is looking for and identify the relevant pages in the index. The search engine then ranks those results based on several factors and presents them, in rank order, in the form of a search engine results page (SERP).
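The lookup-and-rank step can be sketched the same way. In this toy illustration the “rank” is simply how many query terms each page contains; real engines combine hundreds of signals, and match count stands in for all of them here. The hand-built toy_index is hypothetical:

```python
# A toy version of the lookup-and-rank step: score each page by how many
# query terms it contains, then return pages in descending score order.
# The hand-built toy_index below is hypothetical; real engines combine
# hundreds of ranking signals, which match count stands in for here.
from collections import Counter


def search(index, query):
    """Return page URLs ordered by the number of query terms they match."""
    scores = Counter()
    for word in query.lower().split():
        for url in index.get(word, ()):
            scores[url] += 1
    # most_common() yields (url, score) pairs in SERP-style rank order.
    return [url for url, _ in scores.most_common()]


toy_index = {
    "search": {"a.html", "b.html"},
    "engine": {"a.html"},
    "crawler": {"b.html"},
}
print(search(toy_index, "search engine"))  # -> ['a.html', 'b.html']
```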
Securing a high rank on these pages is of such enormous interest that an entirely new industry emerged in the mid-1990s. Referred to as search engine optimization (SEO), it helps marketers optimize their websites for search.