Abstract

Although the velocity-threshold method is widely applied, microsaccade detection remains challenging because of gaze-tracking inaccuracy and the transient nature of microsaccades. Key parameters of a saccadic event, such as duration, amplitude, and peak velocity, are sometimes imprecisely estimated, which can bias inferences about the roles of microsaccades in perception and cognition. To overcome these biases and improve microsaccade detection, we propose a novel statistical model of the tracked gaze positions during eye fixations. The model incorporates a parametrization previously applied to model saccades, which allows us to veridically capture the velocity profile of saccadic eye movements. From this model, we derive the Neyman-Pearson Detector (NPD) for saccadic events. Implemented in conjunction with maximum likelihood estimation, the NPD can detect a saccadic event and estimate all of its parameters simultaneously. Owing to its adaptive nature and statistical optimality, our NPD method detected microsaccades more accurately on some datasets than a recently proposed state-of-the-art method based on convolutional neural networks. NPD also yielded performance comparable to a recently developed Bayesian algorithm, with the added benefit of modeling a more biologically veridical saccadic velocity profile. Unlike these algorithms, NPD lends itself better to online saccade detection, and thus has potential for human-computer interaction applications. Our algorithm is publicly available at https://github.com/hz-zhu/NPD-micro-saccade-detection.
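For context, the velocity-threshold baseline that the abstract contrasts against can be sketched as follows. This is an illustrative detector in the spirit of the classic median-based velocity-threshold approach (Engbert & Kliegl, 2003), not the NPD method proposed in the paper; the function name, sampling rate, multiplier `lam`, and minimum-duration parameter are all assumptions chosen for the example.

```python
import numpy as np

def velocity_threshold_detect(x, y, fs=500.0, lam=6.0, min_samples=3):
    """Illustrative velocity-threshold microsaccade detector (not the paper's NPD).

    x, y : 1-D gaze position traces (degrees)
    fs   : sampling rate (Hz)
    lam  : multiplier applied to a robust (median-based) velocity SD
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    # Central-difference velocity estimate, converted to deg/s
    vx = np.gradient(x) * fs
    vy = np.gradient(y) * fs
    # Median-based (outlier-robust) estimate of the velocity standard deviation
    sx = np.sqrt(np.median(vx ** 2) - np.median(vx) ** 2)
    sy = np.sqrt(np.median(vy ** 2) - np.median(vy) ** 2)
    # Elliptic threshold test: True where velocity exceeds the threshold
    above = (vx / (lam * sx)) ** 2 + (vy / (lam * sy)) ** 2 > 1.0
    # Group consecutive supra-threshold samples into candidate events,
    # discarding runs shorter than min_samples
    events, start = [], None
    for i, flag in enumerate(above):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            if i - start >= min_samples:
                events.append((start, i - 1))
            start = None
    if start is not None and len(above) - start >= min_samples:
        events.append((start, len(above) - 1))
    return events
```

A fixed threshold multiplier like this is exactly what makes the baseline sensitive to noise level, which motivates the adaptive, model-based detector described in the abstract.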
