Eye tracking is the measurement of point of gaze. Eye trackers (Exhibit 25.2) track the movement of the eye, recording what people see and how long the fixation lasts.
Eye tracking research dates back to the 1800s. Early studies revealed that when people read text their eyes make short rapid movements (saccades) intermingled with short stops (fixations).
The use of eye tracking in market research was constrained because tracking devices used to be intrusive, cumbersome and expensive. Advances, particularly in recent decades, have led to the development of devices that are easy to use in natural settings and allow for precise and objective measurements of eye movements in real-time.
As eye tracking expands into mainstream applications, eye trackers will become inexpensive. That should accelerate their use in consumer research.
Eye trackers illuminate the eye using near-infrared light-emitting diodes (LEDs). As shown in Exhibits 25.3 and 25.4, cameras fitted on the devices record the reflections of the light, and computer algorithms analyse the reflections to reveal the direction of the gaze. The underlying concept is commonly referred to as pupil centre corneal reflection (PCCR).
The wavelength of the light emitted by the LED illuminators is outside the visible spectrum. The light cannot be seen by the user, but it can be detected by near-infrared cameras fitted onto the eye tracker.
Light that falls on the pupil enters the eye, whereas light that falls on the cornea outside the pupil is reflected back. The camera therefore records the pupil as a dark spot, while the corneal reflections appear brighter. This is referred to as dark pupil eye tracking.
If the LED light source is coaxial with the optical path, then it reflects back off the retina creating a bright pupil effect similar to red eye.
In both dark and bright pupil eye tracking, the reflection from the pupil is distinct from the rest of the eye. Computer algorithms process the vector (refer Exhibit 25.5) formed by the reflections from the pupil and the cornea to detect the direction of the gaze.
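To make the PCCR idea concrete, here is a minimal sketch in Python. The coordinates, the simple linear per-axis calibration, and the function names are illustrative assumptions, not any vendor's actual algorithm; real trackers fit more elaborate mappings during the calibration procedure.

```python
# Sketch of the pupil centre corneal reflection (PCCR) concept.
# All numbers and the linear calibration are illustrative assumptions.

def pccr_vector(pupil_centre, glint):
    """Vector from the corneal reflection (glint) to the pupil centre,
    in camera-image pixels."""
    return (pupil_centre[0] - glint[0], pupil_centre[1] - glint[1])

def gaze_point(vector, calibration):
    """Map the pupil-glint vector to screen coordinates using a simple
    per-axis gain and offset obtained from calibration."""
    (gain_x, offset_x), (gain_y, offset_y) = calibration
    return (gain_x * vector[0] + offset_x,
            gain_y * vector[1] + offset_y)

# Hypothetical measurements from one camera frame:
v = pccr_vector(pupil_centre=(312.0, 240.5), glint=(305.0, 237.0))
print(gaze_point(v, calibration=((120.0, 960.0), (130.0, 540.0))))
```

The calibration step in practice shows the user a sequence of known on-screen targets and fits the mapping from pupil-glint vectors to those targets, which is why the brief calibration described above is required before use.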
For tracking gaze in real or virtual 3-dimensional space, the cameras need to observe and combine the gaze directions of both eyes.
Since our eyes are not identical in size, shape or structure, a simple calibration procedure is required before use.
The two most commonly available eye trackers are the glasses (wearable and mobile) and screen-based (remote or desktop) types.
Glasses (Exhibit 25.6) are quite light, weighing about 50g. Respondents’ eyes can be tracked as they move freely in a real or virtual setting.
In addition to the glasses the device has a wired or wireless recording unit.
Tobii Pro Glasses 2 is a 3D model that comes with 4 eye cameras. The gaze sampling frequency can be set to 50 or 100 Hz.
These devices are ideal for a wide range of consumer research including packaging testing, advertising testing, usage studies, and shopping behaviours.
Screen-based eye trackers (Exhibit 25.7) are usually mounted under a computer screen. These trackers are used in lab settings or controlled location tests (CLT) for testing a wide variety of media including videos and images, for both online and offline content. Applications in consumer research include copy testing and packaging testing.
Eye tracking quantifies visual attention. The most common metrics relate to fixations and saccades. Fixations are the stable points where the eye is looking for a period of say 100 milliseconds or more, and saccades are the movements between fixations.
Gaze point is the basic unit of measure of eye gaze. If the eye tracker has a sampling frequency of 60 Hz, it captures 60 gaze points per second (or one every 16.67 milliseconds).
A cluster of gaze points, close in time and space, constitutes a fixation. While there is no precise cut-off defining a fixation, it is believed that durations less than 300ms do not get encoded in memory. In practice, devices record fixations for periods of 100ms or more.
The eye movements between fixations are called saccades. They are the lines that connect the different fixations.
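The grouping of gaze points into fixations can be sketched with a simple dispersion-threshold filter of the kind described in the eye tracking literature. The 60 Hz sampling rate, the pixel thresholds and the bounding-box dispersion measure are illustrative assumptions; commercial trackers use proprietary, more robust fixation filters.

```python
# Minimal dispersion-threshold fixation filter (illustrative sketch).

def _dispersion(points):
    """Bounding-box dispersion (width + height) of gaze points."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(gaze, hz=60, max_dispersion=30.0, min_duration_ms=100):
    """Group consecutive gaze points (x, y in pixels) into fixations.

    A run of samples counts as a fixation when its dispersion stays
    under max_dispersion and it lasts at least min_duration_ms.
    At 60 Hz, one gaze point arrives every 16.67 ms, so 100 ms
    corresponds to 6 samples."""
    min_samples = int(min_duration_ms * hz / 1000)
    fixations, start = [], 0
    while start < len(gaze):
        end = start + min_samples
        if end > len(gaze):
            break
        if _dispersion(gaze[start:end]) <= max_dispersion:
            # Grow the window while dispersion stays below threshold.
            while end < len(gaze) and _dispersion(gaze[start:end + 1]) <= max_dispersion:
                end += 1
            window = gaze[start:end]
            fixations.append({
                "x": sum(p[0] for p in window) / len(window),
                "y": sum(p[1] for p in window) / len(window),
                "duration_ms": 1000 * len(window) / hz,
            })
            start = end
        else:
            start += 1
    return fixations
```

Applied to a stream of gaze points, the filter yields a sequence of fixations; the gaps between consecutive fixations correspond to the saccades that connect them.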
When viewing an image or reading text, longer fixation time might mean that the object in focus is more engaging, or it could indicate difficulty in understanding or extracting information. More fixations indicate that the object is more noteworthy, whereas a faster time to first-fixation means that it is better at grabbing attention.
A scan path or gaze plot describes the user’s gaze pattern as a saccade-fixate-saccade sequence represented by lines and bubbles, where the size of the bubble indicates the fixation duration. The gaze plot in Exhibit 25.8 involves a jerky movement from one fixation to another. This is typical of most visual images and videos. Similarly while reading text, eyes tend to jump and pause.
When driving, on the other hand, the eyes fixate constantly on the road or the vehicle in front. There are far fewer saccades and a smoother, more concentrated scan path.
Heatmaps like the one in Exhibit 25.9, represent the areas of the visual that are viewed most. They depict the aggregate gaze points through a colour scheme where warm colours like red and yellow indicate hot spots or areas that are viewed more often and for longer periods.
The “heat intensity” equals the proportion of participants fixating the area of interest multiplied by their average fixation duration, or time spent.
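As a worked example of this definition, suppose 8 of 10 participants fixated an area of interest; the participant counts and durations below are made up for illustration.

```python
# Heat intensity for an area of interest (AOI), per the definition in
# the text: proportion of participants fixating the AOI times their
# average fixation duration. Sample data are made up.

def heat_intensity(fixation_times_ms, n_participants):
    """fixation_times_ms holds the total fixation time on the AOI for
    each participant who fixated it at least once."""
    proportion_fixating = len(fixation_times_ms) / n_participants
    avg_duration_ms = sum(fixation_times_ms) / len(fixation_times_ms)
    return proportion_fixating * avg_duration_ms

# 8 of 10 participants fixated the AOI, for these total durations (ms):
times_ms = [620, 450, 980, 300, 710, 540, 860, 400]
print(round(heat_intensity(times_ms, n_participants=10)))  # prints 486
```

Here the proportion fixating is 0.8 and the average duration 607.5ms, giving an intensity of 486; mapped through a colour scale, higher values render as the red and yellow hot spots.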
These metrics are of considerable relevance to marketing analysts. Advertisers, for instance, design layouts so that a higher proportion of viewers (proportion fixating) see the value proposition or the key messages in the advertisement. Similarly, packaging designers want their packaging to stand out on the shelf. It is important for both that the key elements in their content are red hot spots.
Our pupils typically respond to light, dilating when it is dark and constricting when it is bright.
Pupils also respond to emotional arousal and cognitive or mentally taxing activity. They dilate when the mind is emotionally charged or cognitively stimulated. The extent of dilation is proportional to the intensity of the arousal. It is an autonomic process that cannot be consciously controlled.
So, provided the impact of lighting is accounted for, pupillometry, i.e. the measurement of dilation of the pupil, provides an immediate assessment of emotional arousal.
Note however that pupillary responses do not reveal the nature of the emotions. We need to rely on other techniques in conjunction with eye tracking, for a detailed understanding of emotions.
Besides pupillary responses, eye trackers measure a few other aspects of body language, or more specifically, “eye language”.
They record ocular vergence by tracking inter-pupil distance. Divergence, i.e. eyes moving apart, signals that the mind is drifting, losing focus or concentration.
Eye trackers also monitor frequency of blinks, providing insights into the state of the user’s mind. Delayed or “attentional” blink is indicative of mentally taxing work. Low frequency blinks indicate deeper levels of concentration. Conversely, high frequency blinking is associated with boredom, drowsiness and low concentration levels.
Another body metric, the distance between the eyes and the screen, reflects the respondent’s posture. If a person leans forward, it usually denotes a positive disposition or a higher level of interest. Conversely, when she leans backwards, it suggests negativity, or an inclination to avoid.
The extent to which eye tracking is used in consumer research is expected to grow rapidly over the next few years. Primary applications will include the testing of advertisements, packaging, websites and products.
It is well known that cute babies and pretty faces draw attention. However, if that is all that the viewers see, the key messages, call-to-action and value proposition may remain lost in text. The visual may score on viewership, but fail to generate the right kind of attention.
Take Exhibit 25.10, for instance. In this oft-quoted example, there is a distinct difference in the heat maps for two similar diaper advertisements. In the upper version of the ad, where the baby faces the viewers, they focus mainly on the baby’s face and devote little attention to the text.
The lower ad, in contrast, is more successful in drawing attention to the messages in the text by using a directional cue: here the baby is looking towards the text, which subtly draws viewers’ attention to the desired area of the visual.
As can be seen from this example, eye tracking captures attention and quantifies it, so that marketers can assess viewers’ attention to the key elements in their advertisements, packaging and webpages.
The tracking of pupil dilation also provides an assessment of emotional arousal. However, since pupillometry cannot decipher emotional valence, eye tracking should be combined with biometric techniques such as facial coding or EEG, for an understanding of the nature of the emotions.
In the context of websites, eye tracking is used to assess usability and user experience, in terms of the ease with which users can obtain the information they seek.
It is undesirable if viewers spend too much time reading general information or basic instructions. If the fixation time is substantially longer than intended, this could indicate difficulty in understanding the instructions, resulting in deterioration of the users’ experience.
Similarly, the time to first-fixation reveals the extent to which key elements grab viewers’ attention. A key metric in packaging testing, time to first-fixation captures the packaging’s ability to stand out on the shelf.
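As an illustration, time to first-fixation on an area of interest (AOI) can be derived from a fixation list as follows. The field names and the rectangular AOI representation are hypothetical assumptions for the sketch, not a standard API.

```python
# Hypothetical helper: time to first-fixation on a rectangular AOI.
# Field names ("x", "y", "onset_ms") are illustrative assumptions.

def time_to_first_fixation(fixations, aoi):
    """fixations: list of dicts with 'x', 'y' (pixels) and 'onset_ms'
    (time since stimulus onset), in chronological order.
    aoi: (left, top, right, bottom) rectangle in pixels.
    Returns the onset of the first fixation inside the AOI, else None."""
    left, top, right, bottom = aoi
    for f in fixations:
        if left <= f["x"] <= right and top <= f["y"] <= bottom:
            return f["onset_ms"]
    return None

fixations = [{"x": 50, "y": 50, "onset_ms": 0},
             {"x": 300, "y": 200, "onset_ms": 250}]
print(time_to_first_fixation(fixations, aoi=(250, 150, 400, 300)))
```

A shorter value indicates that the AOI, such as a brand logo or key claim, captured attention sooner; `None` would mean the element was never fixated at all.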
Eye tracking may also be used to study the usage of products that demand skill and familiarity from users, such as computer games, computer-aided applications, automobiles, airplanes and a wide range of sports products.
Human computer interaction (HCI), a growing field of research into the usage of computers and computer applications, relies on eye tracking. HCI is helping engineers design superior computers and improve software applications.
Simulators for automobiles (or airplanes) use eye tracking and other sensors to understand how drivers drive, and how they respond to danger and obstacles on the road. The devices can reveal how speeding and reckless driving can impair visual attention. They help product designers build products that are safer and easier to use.
Eye trackers also have the scope to become an integral part of products that could make use of the information these devices capture. In the future, perhaps an automobile with eye tracking can be designed to respond to the drivers’ gaze, eye movements or pupil dilation. For instance, if the eyes are off the road or if the driver is falling asleep, the vehicle may be designed to respond so as to avert an accident.
There is scope for use of eye tracking in interactive television and in applications that are interactive in nature.
There is great scope too, largely untapped for now, in virtual reality simulation. Eye tracking makes virtual reality seamless with the eyes selecting what the user wants to see and where the user wants to go. This substantially enhances user’s experience, making virtual simulation closer to reality and consequently better suited for consumer research in areas such as simulated store tests and virtual in-store shopping behaviour studies.
Importantly, while the eye tracker shows where the eyes are directed, it cannot reveal what the mind perceives. Sometimes the mind is so preoccupied that the eyes do not register what lies in their line of sight. For instance, when someone looks right through you, or when a banner ad is in the viewer’s line of sight but she does not perceive it.
To understand what respondents perceive and how they feel, we need to continue to rely on existing research and analytic techniques in quantitative and qualitative consumer research as well as adopt newer tools like EEG and facial coding.
Eye tracking is used in conjunction with EEG to identify the elements in a visual that stir emotions, and with facial coding devices to reveal the aspects that evoke facial expressions. These techniques are covered later in this chapter.
Details of how eye tracking may be used in conjunction with other biometric devices, are covered in the section Applications of Biometrics in Marketing.
Note: To find content on MarketingMind type the acronym ‘MM’ followed by your query into the search bar. For example, if you enter ‘mm consumer analytics’ into Chrome’s search bar, relevant pages from MarketingMind will appear in Google’s result pages.