Abstract

Eye tracking is an advancing technology that holds significant promise for improving our understanding of human behavior and decision making. Gaze data gathered by eye trackers contain events known as fixations. Fixations indicate visual attention and awareness, and are identified by algorithms that parse eye-tracking data into a sequence of gaze point clusters. Despite this potential, eye-tracker imprecision often yields noisy gaze data, arising from calibration errors, erratic eye movements, or other system noise. Noise can cause inaccurate identification of fixations in eye-tracking applications, resulting in misleading behavioral interpretations and conclusions. Therefore, fixation identification algorithms should be robust against data noise. To resolve such inaccuracies, we propose FID+: outlier-aware fixation identification via fixation inner-density. We represent the problem of detecting outliers in fixation gaze data through a novel mixed-integer optimization formulation, and subsequently strengthen the formulation using two geometric arguments to provide enhanced bounds. We show that neither bound dominates the other, and that both are effective in reducing the overall solution runtime. Our experiments on real gaze recordings demonstrate that accounting for fixation outliers enhances the ability to identify denser fixations within a reasonable runtime.
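
To make the outlier-aware idea concrete, the following is a minimal, brute-force Python sketch of the underlying intuition: given a small window of gaze points, drop at most k candidate outliers so that the remaining points are as tightly clustered as possible. This is not the paper's mixed-integer formulation or its geometric bounds; the point data, the dispersion measure (maximum pairwise distance), and the outlier budget k are illustrative assumptions.

```python
# Toy illustration of outlier-aware fixation density: remove up to k noisy
# samples from a gaze window so the remaining cluster has minimal dispersion.
# This exhaustive search is a conceptual sketch, not the paper's MIO model.
from itertools import combinations
from math import dist


def dispersion(points):
    """Maximum pairwise distance among the points (0.0 if fewer than two)."""
    return max((dist(p, q) for p, q in combinations(points, 2)), default=0.0)


def densest_subset(points, k):
    """Drop at most k points and return the subset with the smallest dispersion."""
    best, best_disp = list(points), dispersion(points)
    for drop in range(1, k + 1):
        for removed in combinations(range(len(points)), drop):
            kept = [p for i, p in enumerate(points) if i not in removed]
            d = dispersion(kept)
            if d < best_disp:
                best, best_disp = kept, d
    return best, best_disp


if __name__ == "__main__":
    # Hypothetical gaze samples (screen coordinates); the last one is a noisy spike.
    window = [(1.00, 1.00), (1.10, 0.90), (0.90, 1.10), (1.05, 1.00), (4.00, 3.50)]
    kept, d = densest_subset(window, k=1)
    print(f"kept {len(kept)} points, dispersion = {d:.2f}")
```

Removing the single noisy spike sharply reduces the dispersion of the candidate fixation, which is the effect the outlier-aware formulation is designed to capture at scale.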
