Abstract

We introduce a conceptually novel method for eye-movement signal analysis. The method is general in that it does not place severe restrictions on sampling frequency, measurement noise, or subject behavior. Event identification is based on segmentation that simultaneously denoises the signal and determines event boundaries. The full gaze position time series is segmented into an approximately optimal piecewise linear function in O(n) time. Gaze feature parameters for classification into fixations, saccades, smooth pursuits, and post-saccadic oscillations are derived from human labeling in a data-driven manner. The range of oculomotor events identified and the powerful denoising performance make the method usable both for low-noise controlled laboratory settings and for high-noise complex field experiments. This is desirable for harmonizing the gaze-behavior (in the wild) and oculomotor-event-identification (in the laboratory) approaches to studying eye movements. Denoising and classification performance are assessed using multiple datasets. A full open-source implementation is included.
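
As a concrete illustration of the segmentation idea, the sketch below fits a noisy two-dimensional gaze trace with a piecewise-linear function using a simple greedy split-on-error rule. It is not the paper's approximately optimal O(n) NSLR algorithm; the 0.5 degree residual threshold and the synthetic 120 Hz trace are assumptions made only for the example, but it shows how evaluating the fitted segments denoises the signal and yields candidate event boundaries.

```python
# Toy illustration of a piecewise-linear gaze representation.  This greedy
# "split when a single line stops fitting" rule is NOT the paper's
# approximately optimal O(n) NSLR algorithm; the 0.5 deg RMSE threshold and
# the synthetic 120 Hz trace below are assumptions made only to show the
# idea: each segment is a short line in time, and evaluating the fitted
# lines gives a denoised version of the raw gaze signal.
import numpy as np


def greedy_piecewise_linear(t, xy, max_rmse_deg=0.5):
    """Segment a gaze trace into index ranges that each fit a single line.

    t  : (n,) sample times in seconds
    xy : (n, 2) horizontal/vertical gaze position in degrees
    Returns a list of (first_index, last_index) pairs; adjacent segments
    share their boundary sample, as in a continuous piecewise-linear fit.
    """
    segments, start, end = [], 0, 2          # need two samples to fit a line
    while end <= len(t):
        ts, ps = t[start:end], xy[start:end]
        resid = [ps[:, d] - np.polyval(np.polyfit(ts, ps[:, d], 1), ts)
                 for d in range(2)]
        rmse = np.sqrt(np.mean(np.concatenate(resid) ** 2))
        if rmse > max_rmse_deg and end - start > 2:
            segments.append((start, end - 2))  # last sample that still fit
            start, end = end - 2, end          # restart at the shared sample
        else:
            end += 1
    segments.append((start, len(t) - 1))
    return segments


# Synthetic example: a fixation, a saccade-like 8 deg jump, another fixation,
# sampled at 120 Hz with 0.3 deg Gaussian noise (typical of mobile trackers).
rng = np.random.default_rng(0)
t = np.arange(0.0, 1.0, 1 / 120)
x = np.where(t < 0.5, 0.0, 8.0) + rng.normal(0.0, 0.3, t.size)
y = rng.normal(0.0, 0.3, t.size)
print(greedy_piecewise_linear(t, np.column_stack([x, y])))
```

The paper's method instead searches for an approximately optimal segmentation of the full time series rather than committing to breakpoints greedily, which is what allows it to denoise and segment in a single pass.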

Highlights

  • Humans and other foveate animals – such as monkeys and birds of prey – visually scan scenes with a characteristic fixate-saccade-fixate pattern: periods of relative stability are interspersed with rapid shifts of gaze

  • We introduce Naive Segmented Linear Regression (NSLR), a new method for eye-movement signal denoising and segmentation, and a related event classification method based on Hidden Markov Models (NSLR-HMM); a toy sketch of the classification stage follows this list

  • Compared to the state of the art, NSLR-HMM offers several novel and desirable features for eye-movement signal analysis
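
Below is a minimal sketch of the classification stage referred to above: piecewise-linear segments are assigned to event classes with a hidden Markov model decoded by the Viterbi algorithm. NSLR-HMM derives its parameters from human-labelled data, whereas the Gaussian emission models, the transition matrix, and the single log-speed feature used here are invented placeholders, so this shows the mechanism rather than the published classifier.

```python
# Minimal sketch of classifying piecewise-linear segments with a hidden
# Markov model via Viterbi decoding.  NSLR-HMM learns its parameters from
# human-labelled data; the Gaussian emission parameters, the transition
# matrix, and the single log10 segment-speed feature below are invented
# placeholders for illustration only.
import numpy as np

STATES = ["fixation", "saccade", "pso", "smooth_pursuit"]

# Placeholder emission models: (mean, sd) of log10 segment speed in deg/s.
EMISSIONS = {"fixation": (0.5, 0.4), "saccade": (2.3, 0.3),
             "pso": (1.5, 0.4), "smooth_pursuit": (1.1, 0.4)}

# Placeholder transition probabilities between consecutive segments
# (rows = from, columns = to, in the order given by STATES).
TRANSITIONS = np.array([
    [0.75, 0.20, 0.00, 0.05],   # from fixation
    [0.40, 0.00, 0.50, 0.10],   # from saccade (often followed by a PSO)
    [0.70, 0.20, 0.00, 0.10],   # from PSO
    [0.30, 0.20, 0.00, 0.50],   # from smooth pursuit
])


def log_gauss(x, mean, sd):
    """Log density of a univariate Gaussian."""
    return -0.5 * ((x - mean) / sd) ** 2 - np.log(sd * np.sqrt(2.0 * np.pi))


def classify_segments(log_speeds):
    """Viterbi decoding of the most likely event class per segment."""
    n, k = len(log_speeds), len(STATES)
    log_e = np.array([[log_gauss(s, *EMISSIONS[st]) for st in STATES]
                      for s in log_speeds])
    log_t = np.log(TRANSITIONS + 1e-12)
    score = np.full((n, k), -np.inf)
    back = np.zeros((n, k), dtype=int)
    score[0] = -np.log(k) + log_e[0]               # uniform initial state
    for i in range(1, n):
        cand = score[i - 1][:, None] + log_t       # [previous, current]
        back[i] = cand.argmax(axis=0)
        score[i] = cand.max(axis=0) + log_e[i]
    path = [int(score[-1].argmax())]
    for i in range(n - 1, 0, -1):
        path.append(int(back[i, path[-1]]))
    return [STATES[s] for s in reversed(path)]


# Example: slow, very fast, fast-ish, slow segments (log10 deg/s).
# Prints: ['fixation', 'saccade', 'pso', 'fixation']
print(classify_segments([0.4, 2.4, 1.6, 0.6]))
```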

Introduction

Humans and other foveate animals – such as monkeys and birds of prey – visually scan scenes with a characteristic fixate-saccade-fixate pattern: periods of relative stability are interspersed with rapid shifts of gaze. Much of what we know about oculomotor control circuits is based on laboratory experiments in which the participant’s head is fixed with a chin rest or a bite bar, and the stimulus and task are restricted so as to elicit only a specific eye movement type. Sampling frequencies in such setups may range from 500 Hz to as high as 2000 Hz. Because the subject’s behavior is restricted, it is possible to tailor custom event identification methods that rely on only the eye movement type of interest being present in the data (and that would produce spurious results with data from free eye movement behavior). Mobile measuring equipment, in contrast, has much lower accuracy and relatively high levels of noise, with sampling frequencies typically between 30 and 120 Hz. The subject’s behavior in such settings is also complex, calling for robust event identification that works when several different event types can occur in the same recording. For wider generalizability of results, it would be desirable to analyze eye movement events in a similar way across task settings, by using event detection methods that do not rely on restrictions or assumptions that are not valid for most natural behavior.

