Abstract

Microsaccades (or small saccades) are fixational eye movements with high velocity. They have been proposed as an index of covert spatial attention, but this proposal has been contested. One reason it has been difficult to reach consensus is that different studies use different, arbitrary detection criteria, such as velocity thresholds. Here, we developed a principled method for identifying microsaccades, based on Bayesian changepoint detection. Our generative model contains a latent state variable that changes between drift/tremor and microsaccade states at random times. We model the eye position as a biased random walk with a different velocity distribution for each state; on average, microsaccades have higher speed. Using this generative model, we computed the posterior probability over the time series of the state variable given the entire eye position time series. To sample from this high-dimensional posterior while avoiding local maxima, we used parallel-tempered MCMC. To test the validity of our algorithm, we applied it to simulated eye position data from the generative model. At low noise levels, we recovered the true microsaccades near perfectly, while at higher noise levels, we found state vectors with higher posterior probabilities than the true time series. When we applied the algorithm to real data, the inferred microsaccades were comparable to those found by previous methods. Our approach has advantages over previous methods: (1) the detection criterion is derived, not assumed; (2) we obtain a probabilistic judgment (i.e., a confidence rating) instead of a binary one; (3) the method can be straightforwardly adapted as the generative model is refined. Meeting abstract presented at VSS 2015.
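
The generative model sketched in the abstract can be made concrete with a short simulation. The code below is an illustrative sketch, not the authors' implementation: it assumes a two-state first-order Markov process over the latent state (drift/tremor vs. microsaccade), a one-dimensional eye position, and Gaussian per-sample velocities; for brevity the walk is unbiased here, whereas the paper's model is biased, and all parameter names and values (p_enter, p_exit, sigma_drift, sigma_ms) are hypothetical.

```python
import numpy as np

def simulate_eye_trace(T=2000, p_enter=0.005, p_exit=0.1,
                       sigma_drift=0.01, sigma_ms=0.3, seed=0):
    """Simulate a 1-D eye position trace from a two-state generative model.

    state 0 = drift/tremor (low-speed random walk)
    state 1 = microsaccade (high-speed random walk)
    For brevity the walk is unbiased; the paper's model is a biased walk.
    All parameter values are illustrative, not taken from the paper.
    """
    rng = np.random.default_rng(seed)
    states = np.zeros(T, dtype=int)
    pos = np.zeros(T)
    for t in range(1, T):
        # the latent state switches at random times (first-order Markov chain)
        if states[t - 1] == 0:
            states[t] = int(rng.random() < p_enter)
        else:
            states[t] = int(rng.random() >= p_exit)
        # state-dependent velocity distribution: microsaccades are faster on average
        sigma = sigma_ms if states[t] == 1 else sigma_drift
        pos[t] = pos[t - 1] + rng.normal(0.0, sigma)
    return pos, states
```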
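Given such a trace, inference targets the posterior over the binary state time series. The sketch below is a minimal parallel-tempered Metropolis sampler for that posterior, under the same simplified model and illustrative parameters as above. The temperature ladder (betas), single-bit-flip proposals, and swap schedule are assumptions rather than the authors' choices, and the full log posterior is recomputed per proposal for clarity, not efficiency.

```python
import numpy as np

def log_posterior(states, pos, p_enter=0.005, p_exit=0.1,
                  sigma_drift=0.01, sigma_ms=0.3):
    """Unnormalized log p(states | positions) for the simplified model above."""
    v = np.diff(pos)
    sigma = np.where(states[1:] == 1, sigma_ms, sigma_drift)
    log_lik = np.sum(-0.5 * (v / sigma) ** 2 - np.log(sigma))
    # first-order Markov prior over state transitions
    logp = np.log(np.array([[1 - p_enter, p_enter],
                            [p_exit, 1 - p_exit]]))
    log_prior = logp[states[:-1], states[1:]].sum()
    return log_lik + log_prior

def parallel_tempered_mcmc(pos, n_iter=50000, betas=(1.0, 0.5, 0.25, 0.1), seed=0):
    """Sample the state time series; hotter chains help escape local maxima."""
    rng = np.random.default_rng(seed)
    T, K = len(pos), len(betas)
    chains = [rng.integers(0, 2, size=T) for _ in range(K)]
    logps = [log_posterior(c, pos) for c in chains]
    for _ in range(n_iter):
        for k in range(K):
            # within-chain Metropolis move: flip the state at one random time
            t = rng.integers(0, T)
            prop = chains[k].copy()
            prop[t] ^= 1
            lp = log_posterior(prop, pos)
            if np.log(rng.random()) < betas[k] * (lp - logps[k]):
                chains[k], logps[k] = prop, lp
        # propose swapping configurations between adjacent temperatures
        k = rng.integers(0, K - 1)
        if np.log(rng.random()) < (betas[k] - betas[k + 1]) * (logps[k + 1] - logps[k]):
            chains[k], chains[k + 1] = chains[k + 1], chains[k]
            logps[k], logps[k + 1] = logps[k + 1], logps[k]
    return chains[0]  # one sample from the untempered (beta = 1) chain
```

Averaging the per-time-point state indicators over many retained samples from the beta = 1 chain would yield the probabilistic, rather than binary, microsaccade judgment referred to in the abstract.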
