At EyeQuant, we do a lot of eye-tracking as part of our mission to teach computers to see the web like humans do. The main purpose of our studies is to find the statistical patterns that power our attention models (which you can use to instantly test your websites!). Today, we're sharing 3 of the most surprising insights we've found.
A lot of you have asked us for general rules of thumb about what drives (and doesn't drive) attention. In this post, you'll learn why such rules are difficult to establish, and how many common ideas about human attention are more complicated than they seem. We're hoping to use data to dispel some widespread myths about attention and web design.
METHOD: We're looking at data from one of our recent eye-tracking studies, in which 46 subjects purchased products on 200 AdWords eCommerce pages. We recorded 261,150 fixations in total, and users were looking at each webpage for 15 sec (+/- 6 sec) on average. The study was conducted in the Neurobiopsychology Lab at the University of Osnabrueck, Germany.
DISCLAIMER: Since the purpose of this study was to further expand EyeQuant's predictive capabilities, we're also showing EyeQuant's predictions next to the empirical data for comparison. Please note that these predictions are based on a new EyeQuant model that's currently in early testing, but they are already quite close to the real thing: this model provides over 75% predictive accuracy (measured as AUC, warning: math), whereas our standard model achieves over 90%.
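For readers curious about the AUC metric mentioned above: it is the probability that a randomly chosen fixated location receives a higher model score than a randomly chosen non-fixated one (ties count half), so 50% means chance and 100% means perfect ranking. A minimal sketch of that pairwise definition (the example scores are made up for illustration, not EyeQuant data):

```python
def auc(pos_scores, neg_scores):
    """AUC as the fraction of (positive, negative) pairs where the
    positive (fixated) location outscores the negative one; ties count 0.5."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical saliency scores at fixated vs. non-fixated pixels:
print(auc([0.9, 0.8, 0.4], [0.3, 0.5, 0.1]))  # 8 of 9 pairs ranked correctly
print(auc([1.0, 2.0], [1.0, 2.0]))            # identical distributions -> chance level, 0.5
```

Real saliency benchmarks compute this over many pixels per image (often with rank-based shortcuts rather than the O(n*m) loop above), but the interpretation is the same.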