Abstract

Eye movement plays an important role in cognition and perception, and the detection of saccades and fixations has been studied for human-computer interaction applications. However, such studies have often been conducted under conditions where head movement is constrained, and often using calibration-dependent gaze information rather than the raw pupil position. To investigate the performance of saccade and fixation detection on gaze versus pupil data, three representative saccade detection algorithms are applied to both pupil data and gaze data collected with and without head movement, and their performance is evaluated against a stimulus-induced ground truth using several measures. Results indicate that saccade/fixation detection on pupil data generally outperforms detection on gaze data, with an 8.6% improvement in Cohen’s Kappa (averaged across the three algorithms), even when moderate head movement is involved. Hence, pupil data can serve as an alternative to gaze data for saccade and fixation detection in wearable contexts, requiring less calibration effort while achieving higher accuracy.
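
The sketch below illustrates the kind of pipeline the abstract describes: a simple velocity-threshold (I-VT) detector, one common representative saccade detection algorithm but not necessarily one of the three evaluated in the paper, applied to a 2-D position signal (pupil or gaze), with the resulting labels scored against a ground-truth sequence using Cohen's Kappa. The function names, the 30 units/s default threshold, and the binary saccade/fixation labelling are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def ivt_detect(positions, timestamps, velocity_threshold=30.0):
    """Label each sample as saccade (1) or fixation (0) with a velocity
    threshold (I-VT).  `positions` is an (N, 2) array of pupil or gaze
    coordinates; `timestamps` is an (N,) array of sample times in seconds.
    The default threshold is an illustrative value, not the paper's setting."""
    positions = np.asarray(positions, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)
    # Sample-to-sample displacement and elapsed time give a velocity estimate.
    disp = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    dt = np.diff(timestamps)
    velocity = disp / dt
    labels = (velocity > velocity_threshold).astype(int)
    # Repeat the first label so the output aligns with the input samples.
    return np.concatenate([[labels[0]], labels])

def cohens_kappa(predicted, ground_truth):
    """Cohen's Kappa between two binary label sequences of equal length."""
    predicted = np.asarray(predicted)
    ground_truth = np.asarray(ground_truth)
    observed = np.mean(predicted == ground_truth)
    # Chance agreement from the marginal rates of the two label sequences.
    p1, g1 = predicted.mean(), ground_truth.mean()
    expected = p1 * g1 + (1 - p1) * (1 - g1)
    return (observed - expected) / (1 - expected)
```

In the study's setting, such a detector would be run once on pupil coordinates and once on calibrated gaze coordinates, and the two Kappa scores against the stimulus-induced ground truth would then be compared.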
