Abstract
Eye-tracking technology reflects human attention and cognition and is widely used as a research tool. To analyze eye movement data, users must define specific regions known as areas of interest (AOIs). Although existing tools offer dynamic-AOI functions for processing visual behavior on moving stimuli, they may require users either to place markers in the physical environment to specify the contours of moving stimuli or to define AOIs manually on screen. This paper proposes a toolbox named EyeBox that 1) automatically recognizes dynamic AOIs based on SIFT and extracts eye movement indicators, and 2) draws fixations. We also design a user-friendly interface for the toolbox. EyeBox currently supports data recorded with the Pupil Core device. To evaluate the toolbox, we compared results obtained manually with those produced by EyeBox on a custom eye-tracking dataset: 3 of 4 data sets reached above 90% accuracy for AOI1, and 4 of 5 data sets exceeded 90% accuracy for AOI2. Finally, we conducted a user study to test the usability and user experience of EyeBox.
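The core idea of the pipeline, tracking an AOI's location frame by frame (e.g. via SIFT matching) and then mapping fixations onto it to extract indicators such as fixation count and dwell time, can be sketched as follows. This is a minimal illustration only: the `Fixation` class, the `assign_fixations` function, and the bounding-box AOI representation are assumptions for the sketch, not EyeBox's actual API.

```python
from dataclasses import dataclass

@dataclass
class Fixation:
    t: float    # timestamp in seconds
    x: float    # gaze x in scene-camera pixels
    y: float    # gaze y in scene-camera pixels
    dur: float  # fixation duration in seconds

def assign_fixations(fixations, aoi_track):
    """Map fixations onto a dynamic AOI and compute simple indicators.

    aoi_track: dict mapping a frame timestamp to the AOI's bounding box
    (x0, y0, x1, y1) in that frame -- e.g. a box recovered by matching
    SIFT features of a template image against each scene frame.
    A fixation counts as a hit if its point lies inside the box of the
    temporally nearest tracked frame.
    Returns (fixation_count, total_dwell_time).
    """
    times = sorted(aoi_track)
    hits = []
    for f in fixations:
        nearest = min(times, key=lambda ts: abs(ts - f.t))
        x0, y0, x1, y1 = aoi_track[nearest]
        if x0 <= f.x <= x1 and y0 <= f.y <= y1:
            hits.append(f)
    return len(hits), sum(f.dur for f in hits)
```

For example, with an AOI box tracked at three timestamps and three fixations, only the fixations whose gaze points fall inside the box of the nearest frame contribute to the count and dwell time.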