Overview of the Search Process

According to Google, as of 2014 there were 60 trillion individual pages on the web. Search engines like Google, Yahoo and Bing help us retrieve relevant content from this gigantic glut of information in a timely manner. To do this, they maintain an index of pages containing the information their algorithms need to generate search results.

Search engines use automated programmes called spiders or bots (short for robots, e.g. Googlebot) that crawl the web from link to link, retrieving data about each page. The search engines store this information in a massive database called the index; Google's index alone exceeds 100 million gigabytes.
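The crawling step described above can be sketched in a few lines. This is a minimal illustration, not how Googlebot actually works: the in-memory `WEB` dictionary stands in for live pages fetched over HTTP, and the page names and contents are invented for the example.

```python
from collections import deque

# Toy "web": URL -> (page text, outgoing links). A stand-in assumption
# for real pages that a spider would fetch over HTTP.
WEB = {
    "a.html": ("home page", ["b.html", "c.html"]),
    "b.html": ("about page", ["a.html"]),
    "c.html": ("products page", ["b.html", "d.html"]),
    "d.html": ("contact page", []),
}

def crawl(seed):
    """Breadth-first crawl from a seed URL, storing each page's data."""
    index = {}
    seen = {seed}
    queue = deque([seed])
    while queue:
        url = queue.popleft()
        text, links = WEB[url]
        index[url] = text          # store retrieved data in the index
        for link in links:         # follow links to undiscovered pages
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return index

index = crawl("a.html")
print(sorted(index))  # all four pages are discovered via links
```

Starting from a single seed page, the spider discovers every page reachable through links, which is why pages with no inbound links can remain invisible to search engines.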

When a user types a query into the engine's search box, algorithms interpret the information being sought and identify the relevant pages in the index. The search engine then ranks those pages based on a multitude of factors and presents them, in rank order, in the form of search engine results pages (SERPs).
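The lookup-and-rank step can be illustrated with an inverted index, the data structure search engines build from crawled pages. This is a deliberately simplified sketch: real engines weigh hundreds of ranking factors, whereas here the score is just how often the query terms appear, and the three sample pages are invented for the example.

```python
from collections import defaultdict

# Sample crawled pages (invented for illustration).
pages = {
    "p1": "digital marketing strategy for digital brands",
    "p2": "consumer analytics and marketing research",
    "p3": "search engine marketing and search advertising",
}

# Inverted index: word -> {page: term count}.
inverted = defaultdict(dict)
for page, text in pages.items():
    for word in text.split():
        inverted[word][page] = inverted[word].get(page, 0) + 1

def search(query):
    """Score pages by query-term frequency and return them in rank order."""
    scores = defaultdict(int)
    for word in query.split():
        for page, count in inverted.get(word, {}).items():
            scores[page] += count
    return sorted(scores, key=scores.get, reverse=True)

print(search("digital marketing"))  # "p1" ranks first: most matches
```

Because the index maps words directly to pages, the engine never scans the whole web at query time; it only looks up the query terms and orders the matching pages.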

Securing a high rank on these pages is of such enormous interest that a whole industry has emerged to help marketers optimize their websites.



Digital Marketing Workshop

A two-day hands-on workshop on digital marketing and advertising, training participants to develop and execute effective digital marketing strategies.