November 2000

The Hunt for Usability: Tracking Eye Movements

Introduction

Incorporation of eye position recording into product usability evaluation can provide insights into human-computer interaction that are not available from traditional usability testing methods. We present here some thoughts on this topic, which arose primarily from a CHI 99 workshop. The workshop brought together human-computer interaction designers, eye movement researchers, and usability testing specialists to discuss how to extract information about product usability from users' eye movements.

Keith S. Karn, Steve Ellis, and Cornell Juliano


Why Collect Eye Movement Data?

Usability testing methods have not changed significantly since the origins of the practice. Usability studies typically address human performance at a readily observable task level, with measures such as time to complete a task, percentage of participants succeeding, type and number of errors, and subjective ratings of ease of use (see Dumas & Redish, 1994; Rubin, 1994; and Nielsen & Mack, 1994). Certain types of questions are difficult to answer efficiently with these techniques. Imagine, for example, that we observe users spending longer than expected looking at a particular software application window or web page without making the appropriate selection to reach their goal. In situations such as these, participants often have difficulty reconstructing their thought pattern after the task, or even verbalizing their thought processes in a "think aloud" protocol during the task. The experimenter has no idea what went wrong. Did the user overlook the appropriate control or hyperlink? Did another visual element in the interface – perhaps an animated graphic – distract the user? Did users see the control but fail to comprehend its meaning? Did they look at the corporate branding elements? Different answers to these questions would clearly lead to different design modifications. If overlooking the control is the problem, increasing its salience might be appropriate. If confusion over the control's function is the problem, changing the graphic or text label may be appropriate. If distraction is the problem, decreasing the salience of some elements may be required. Without answers to these questions, design recommendations must be implemented by trial and error, adding to development time and cost.

Recording the movements of participants' eyes during task performance can offer additional information that may help us answer these kinds of questions and reduce trial and error in user interaction design. Eye movement recordings can provide a record of the pattern of fixations (see Figure 1), the time spent looking at various display elements, and insight into the deployment of visual attention.

Caption:

Figure 1. A scan path from one participant using a menu in a software application. Dots represent fixations. Connecting lines represent saccadic eye movements between fixations. Image courtesy of Richard Young.


While the concept of collecting eye position data as part of usability assessment is not new, its use has been confined primarily to aircraft cockpit issues (see Fitts et al. 1950; Harris, 1980; Sanders et al., 1979 and Simmons, 1979). Only recently has eye tracking technology advanced to make it practical in the broader usability community. Important challenges remain, including: incorporating eye position data collection in usability tests, interpreting eye position data, and resolving technical issues with eye tracking technology.

Participants

The overflow of workshop applications demonstrated the excitement about this topic. We selected participants based on interest and familiarity with the issues, as well as on potential for innovative contributions and timely submission of applications. We had a diverse group of participants from four countries; several said that the workshop was their primary reason for attending the CHI conference. Participants came from a variety of professional backgrounds, including computer science, usability evaluation, human factors, user interface design, psychology, and information sciences. About half worked in academic positions and half in industry. About a third of the participants had a primary focus on eye movement research, and about a third had a primary focus on usability testing. One third had already incorporated eye tracking in usability testing at the time of the workshop. Half had developed their own analysis routines, and about half used a video-based eye tracker. Most (19 participants) studied fixation duration and scan paths when analyzing eye movement data.

Here is a list of those who participated in addition to the authors:

· Antti Aaltonen, University of Tampere, Finland

· Mike Byrne, Carnegie Mellon University, USA

· Martha Crosby, University of Hawaii, USA

· Myron Flickner, IBM Corporation, USA

· Joe Goldberg, Pennsylvania State University, USA

· Michael Grace-Martin, University of California at Santa Barbara, USA

· Wayne Gray, George Mason University, USA

· Brooke Hallowell, Ohio University, USA

· Xerxes Kotval, Lucent Technologies, USA

· Joe Lahoud, Applied Sciences Corporation, USA

· Chris Lankford, ERICA Corporation, USA

· N. Hari Narayanan, Auburn University, USA

· Kirsten Risden, Microsoft Corporation, USA

· Dario Salvucci, Carnegie Mellon University, USA

· Tom Tullis, Fidelity Investments, USA

· Bill Weiland, CHI Systems Corporation, USA

· Richard Young, University of Hertfordshire, UK

· Daniela Zambarbieri, Università di Pavia, Italy.

Workshop Discussion Topics

The workshop focused on three main topics:

1. Incorporating eye position data collection in usability tests

· What types of usability questions are best addressed by eye movement data?

· How does eye position data fit with traditional usability data (e.g., user’s verbal comments, time on task)?

· What eye position measures are most useful (e.g., fixation location / duration / sequences, total dwell time on a region, gaze trails, pupil diameter, blink rate)?

· How much do we need to alter our usability labs and subject pool in order to collect eye movement data?

2. Interpreting eye position data

· How should raw data be processed? How are fixations, saccades, etc. defined? What commercial and custom software is available for processing data?

· Do measures correspond to cognitive processes (does a long fixation reflect complexity, ambiguity, incomprehension)?

· Will we be able to interpret data using theories of other disciplines that use eye tracking (e.g., attention, language comprehension) or do we need a separate theory of usability?

3. Resolving technical issues with the eye tracking technology.

· Which eye tracking technology is currently best suited to data collection in the usability lab?

· What type of calibration process is needed (how long does it take, how frequently is it necessary to re-calibrate, what factors affect calibration)?

· How is the subject constrained (e.g., head / body movement, mouse and keyboard use)?

· Developing a requirements list for future eye trackers

Workshop Process

We organized the participants into three teams, one for each of the topics. Pre-workshop discussion occurred via e-mail, provoked by distribution of participants’ position papers.

At the workshop one member from each team gave a summary of the topic to the entire group. Then the teams processed these topics separately. The entire group of participants then reconvened to discuss all three topics.

Incorporating eye tracking in usability tests

The group that focused on incorporating eye tracking in usability tests generated this list of reasons for using eye tracking during usability tests.

· Support other types of data

· Help discriminate “dead time”

· Measure how long a user looked at an area of interest

· Capture a sequential scan path

· Evaluate a specific interface

· Extract general design principles

· Demonstrate scanning efficiency

· Understand expert performance for training

· Help to sell usability testing

· Provide a quantitative comparison of UI designs.

· Provide domain specific benefits (web pages, cockpits, text design)

· Help explain individual differences.

All group members agreed that it is worth the time and effort to collect eye tracker data when the domain is well understood.

Interpreting eye position data

The group that focused on interpretation of eye movement data came up with these top three representations of eye movement patterns acquired during usability testing:

1. Ability to play back scan paths.

2. Time spent in areas of interest (as specified by the evaluator).

3. Transitions between areas of interest (i.e., transition probability matrix).

Note that these representations of the data are considered "third order" based on the following hierarchy.

First Order Data

Raw (unfiltered) data. These are direct eye tracker outputs (e.g., X / Y position data, pupil diameter, blink signal). Note that some eye trackers output processed data.

All data should be reliably time-stamped and paired to allow synchronization with other data streams. Be aware that smoothing data may introduce latencies.
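To make the smoothing caveat concrete, here is a minimal sketch (the window size and sample values are illustrative, not any particular tracker's output) showing how a simple causal moving-average filter delays a saccade-like jump in raw X position data:

```python
# Sketch: smoothing raw gaze X positions with a causal moving average.
# A causal filter like this delays the signal by roughly
# (window - 1) / 2 samples -- the latency the text warns about.

def smooth(samples, window=5):
    """Average each sample with up to `window - 1` preceding samples."""
    out = []
    for i in range(len(samples)):
        chunk = samples[max(0, i - window + 1): i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# A step change in gaze position, as in a saccade:
raw_x = [100.0] * 10 + [300.0] * 10
smoothed = smooth(raw_x, window=5)

# The raw signal jumps at index 10; the smoothed signal does not
# settle at 300.0 until index 14 -- four samples of lag.
```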

Second Order Data

These data are derived from first order data, and their exact definitions vary from study to study due to differences in eye tracking hardware constraints, analysis techniques, and researcher opinion. Second order data include:

· Fixations,

· Pursuit eye movements and

· Saccades.
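As one concrete illustration of how fixations can be derived from first order data, here is a sketch of a dispersion-threshold algorithm; the threshold and minimum-duration values are illustrative only, since, as noted above, exact definitions vary from study to study:

```python
# Sketch of dispersion-threshold fixation detection. Thresholds are
# illustrative; exact fixation definitions vary across studies.

def dispersion(window):
    """Bounding-box size (width + height) of a run of (x, y) points."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return (max(xs) - min(xs)) + (max(ys) - min(ys))

def detect_fixations(points, max_dispersion=25.0, min_samples=5):
    """Group consecutive (x, y) gaze samples into fixations.

    A fixation is a run of >= min_samples points whose dispersion stays
    under max_dispersion (e.g., in pixels). Returns a list of
    (start_index, end_index, centroid) tuples.
    """
    fixations = []
    start = 0
    n = len(points)
    while start < n - min_samples + 1:
        end = start + min_samples
        if dispersion(points[start:end]) <= max_dispersion:
            # Grow the window while dispersion stays small.
            while end < n and dispersion(points[start:end + 1]) <= max_dispersion:
                end += 1
            xs = [p[0] for p in points[start:end]]
            ys = [p[1] for p in points[start:end]]
            centroid = (sum(xs) / len(xs), sum(ys) / len(ys))
            fixations.append((start, end - 1, centroid))
            start = end
        else:
            start += 1
    return fixations

# Two tight clusters of samples separated by a large jump (a saccade)
# should yield two fixations.
points = ([(100 + i % 3, 200 + i % 2) for i in range(10)]
          + [(400 + i % 3, 250) for i in range(10)])
fixations = detect_fixations(points)
```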

Third Order Data

Third order data are derived from second order data and include:

· Scan paths (a playback feature is useful to represent the temporal sequence of fixations),

· Total fixation time within areas of interest and

· Matrix of transition probabilities between areas of interest.
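A minimal sketch of how the last two representations might be computed from a sequence of fixations already assigned to areas of interest; the AOI names and the (label, duration) fixation format are assumptions for illustration:

```python
from collections import defaultdict

# Sketch: deriving two "third order" representations from a fixation
# sequence. AOI names and the (aoi_label, duration_ms) format are
# illustrative.

def dwell_times(fixations):
    """Total fixation time per area of interest."""
    totals = defaultdict(float)
    for aoi, duration in fixations:
        totals[aoi] += duration
    return dict(totals)

def transition_matrix(fixations):
    """Probability of moving from one AOI to another on the next fixation."""
    counts = defaultdict(lambda: defaultdict(int))
    for (src, _), (dst, _) in zip(fixations, fixations[1:]):
        counts[src][dst] += 1
    probs = {}
    for src, dsts in counts.items():
        total = sum(dsts.values())
        probs[src] = {dst: n / total for dst, n in dsts.items()}
    return probs

# Example sequence: (AOI, fixation duration in ms)
seq = [("menu", 250), ("toolbar", 180), ("menu", 300),
       ("canvas", 400), ("menu", 220)]
# dwell_times(seq) totals 770 ms on "menu"; transition_matrix(seq)
# shows fixations leaving "menu" split evenly between the other AOIs.
```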

Fourth Order Data

· Scan path shape,

· Scan path complexity and

· Scan path variability.

Ideal Eye Tracking System for Usability

The group discussing technical issues with the eye tracking technology came up with the following list of attributes for an ideal eye tracking system for usability testing. Note that this is a wish list that we hope eye tracker manufacturers will consider in future designs.

Easy-to-use

· quick set-up, calibration

· simple, rapid collection and analysis of data

· real-time view of scan path

Unobtrusive

· no contact with subject

· unencumbered motion

· non-contact head tracking

· track multiple users simultaneously

Accurate: resolution of < 0.5 degree

Fast: sampling rate ≥ 60 Hz

Robust: able to track any participant

Small and low-cost

Compatible with commercial off-the-shelf software

Fully integrated / synchronized data streams

· eye gaze data, display state, mouse/keyboard

· no latencies among inputs

· time stamp the data at the source.
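As a small illustration of why time-stamping each stream at the source matters, independently recorded streams can then be merged into a single chronological record; the stream contents below are invented for illustration:

```python
import heapq

# Sketch: merging independently time-stamped streams (gaze samples and
# mouse/keyboard events) into one chronological record, keyed on the
# timestamp each device attached at the source. Timestamps in ms.

gaze = [(0, "gaze", (512, 300)),
        (16, "gaze", (515, 302)),
        (33, "gaze", (600, 310))]
input_events = [(10, "mouse_down", (514, 301)),
                (30, "key", "Enter")]

# heapq.merge interleaves the already-sorted streams by timestamp.
merged = list(heapq.merge(gaze, input_events, key=lambda e: e[0]))
```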

Open Issues

The following issues were suggested by the group to be the most significant unresolved issues in the area of incorporating eye tracking in usability testing:

· How to handle excessive volume of data

· Correspondence of eye position and deployment of attention

· Integration of multiple cameras

· Fuse eye, mouse, facial expression, voice input, other data

· Standardize definitions of derived data (e.g., fixations)

· Taxonomy of dependent measures & applications

· Reduce complexity of equipment use & data analysis

· Techniques to deal with spurious data.

Next steps

Usability specialists and researchers around the world continue to create and refine processes for using eye tracking to learn more about human-machine interaction. A follow-up meeting for further discussion is planned for November 6-8, 2000. For more information see http://www.vr.clemson.edu/eyetracking/et-conf/.

About the Authors

Keith S. Karn has 20 years experience in research and product design focused on human machine interaction. He holds advanced degrees in psychology and engineering and has contributed to the development of diverse products including fighter aircraft cockpits, inkjet printers, networked scanners and medical imaging systems. He has used eye movement recording to study short-term spatial memory in the context of visual motor tasks. Currently he is a usability strategist for Xerox Corporation and an adjunct assistant professor at the University of Rochester.

Steve Ellis is a Consulting Member of Technical Staff in the User Experience Department at Avaya Communication, formerly the Enterprise Networks Division of Lucent Technologies. Steve received his Ph.D. in experimental psychology from Carnegie-Mellon University, where he also taught and conducted human performance research. Steve has worked on and managed teams responsible for user interface design and usability testing for a wide variety of communications systems and services. His current work includes using an eye tracking system to help design e-business portal applications.

Cornell Juliano. As a usability specialist at Xerox Corporation, Cornell has combined his interests in machine and human behavior. Cornell studied machine behavior for several years as a mechanic and electrician at Kodak where he built and maintained high-volume production machines. He later studied human behavior at the University of Rochester where he received a BA in Cognitive Science, an MA in Experimental Psychology, and a Ph.D. from the Department of Brain and Cognitive Sciences. He has explored the use of eye tracking as a tool to understand reading comprehension. Currently, Cornell examines ease of use of hardware and software products for document production and management.

Author’s Address

Keith S. Karn
Xerox Corporation
Mail Stop: 0801-10C
Rochester, NY 14623 USA
email: Keith.karn@usa.xerox.com
Tel: 716-427-4549

Steve Ellis
Avaya Communication
101 Crawfords Corner Road
Holmdel, New Jersey USA
email: shellis@avaya.com
Tel: 732-817-3303

Cornell Juliano
Xerox Corporation
Mail Stop: 0801-10C
Rochester, NY 14623 USA
email: cornell.juliano@usa.xerox.com
Tel: 716-427-5451


 
