Last Thursday I went to the Waterloo User Experience Group monthly meeting on eye-tracking technology used in usability practice. It was my first time observing eye-tracking technology in action – quite interesting! The hardware setup for eye-tracking was unobtrusive and much more elegant than I imagined: no cameras in your face, no head-mounted display. The system semi-automatically calibrates to a participant’s eyes, and it’s ready to go. The software that comes with the eye-tracker (e.g. Tobii Studio) is quite advanced. It provides different visualizations of the eye-tracking data – heat maps, gaze-plot visualizations, video playback, mouse-click detection, qualitative analysis and much more – to help the people conducting the test use the data more effectively. However, there are some fundamental questions we need to ask before relying on these rich data.
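To give a rough idea of what a heat-map visualization does under the hood, here is a minimal sketch (not Tobii Studio's actual implementation – just an illustrative assumption) that buckets raw (x, y) gaze samples into a grid of counts; the denser cells are the ones a heat map would render as "hot":

```python
# Hypothetical sketch of heat-map aggregation: bucket raw gaze samples
# into a coarse grid of counts. Real eye-tracking software does far more
# (fixation filtering, smoothing, rendering), but the core idea is similar.
from collections import Counter

def gaze_heatmap(samples, screen_w, screen_h, cols=10, rows=10):
    """Bucket (x, y) gaze samples into a rows x cols grid of hit counts."""
    grid = Counter()
    for x, y in samples:
        col = min(int(x / screen_w * cols), cols - 1)
        row = min(int(y / screen_h * rows), rows - 1)
        grid[(row, col)] += 1
    return grid

# Two samples near the top-left corner, one near the lower right:
samples = [(100, 120), (90, 130), (800, 500)]
heat = gaze_heatmap(samples, screen_w=1024, screen_h=768)
# heat[(1, 0)] is 2 -- the two top-left samples fall in the same cell
```

A gaze plot, by contrast, would keep the samples in order and draw them as a numbered path rather than aggregating them away.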
Seeing vs. Understanding. Although eye-tracking accurately records what your eyes are looking at on an interface, it can’t detect whether you understand what you are looking at. A couple of UX practitioners at the meeting mentioned from their experience that subjects may stare at element A on the interface during the usability test and later not even recall having looked at A. On the other hand, Professor Blair Nonnecke from the University of Guelph pointed out that we rely a lot on our peripheral vision, so we may not need to look directly at something to confirm it’s there. For example, peripheral vision alone is enough to notice there’s a big tag cloud on the right of this blog. Because of this, we can’t really say eye-tracking is “visualizing the human mind”.
How to use eye-tracking? This was the most discussed topic at the meeting. Almost everyone agreed that eye-tracking cannot be used alone as a usability test method but should complement traditional usability test methods. For example, a participant mentioned it was hard for her to find the “Sign In” function on the homepage. Eye-tracking does an excellent job of recording exactly where the user looked before finally finding the “Sign In” feature. This could give designers great insight into where the user expects the “Sign In” function to be.
For a good resource on eye-tracking research, see Jakob Nielsen’s article on F-Shaped Pattern For Reading Web Content.