Abstract

Background: The complexity of analyzing eye-tracking signals increases as eye-trackers become more mobile. The signals from a mobile eye-tracker are recorded in the head coordinate system, and when the head and body move, these movements influence the recorded eye-tracking signal, which makes subsequent event detection difficult.

New method: The purpose of the present paper is to develop a method that performs robust event detection in signals recorded with a mobile eye-tracker. The proposed method compensates for head movements recorded with an inertial measurement unit (IMU) and employs a multi-modal event detection algorithm. The event detection algorithm combines the head-compensated eye-tracking signal with information about objects detected in the scene camera of the mobile eye-tracker.

Results: The method is evaluated with participants seated 2.6 m in front of a large screen, and is therefore only validated for distant targets. The proposed head movement compensation decreases the standard deviation during fixation intervals from 8° to 3.3° for eye-tracking signals recorded during large head movements.

Comparison with existing methods: The multi-modal event detection algorithm outperforms both an existing algorithm (I-VDT) and the built-in algorithm of the mobile eye-tracker, with an average balanced accuracy over all types of eye movements of 0.90, compared to 0.85 and 0.75 for the two compared algorithms, respectively.

Conclusions: The proposed event detector, which combines head movement compensation with information about objects detected in the scene video, enables improved classification of events in mobile eye-tracking data.
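
To make the two processing steps described in the abstract concrete, the sketch below illustrates one plausible shape of the pipeline. It is a minimal sketch under stated assumptions, not the authors' implementation: it assumes gaze is available as unit direction vectors in the head frame and head orientation as IMU quaternions in (w, x, y, z) order; the function names (head_compensate, classify_events), the sampling rate, and the 100°/s velocity threshold are all hypothetical. The simple velocity rule is an I-VT-style stand-in; the paper's multi-modal algorithm additionally fuses objects detected in the scene video, which this sketch omits.

```python
import numpy as np

def head_compensate(gaze_dir_head, head_quat):
    """Rotate head-frame gaze vectors into the world frame using IMU orientation.

    gaze_dir_head : (N, 3) unit gaze direction vectors in the head coordinate system
    head_quat     : (N, 4) head orientation quaternions (w, x, y, z), one per sample
    Returns an (N, 3) array of gaze direction vectors in the world coordinate system.
    """
    w, x, y, z = head_quat.T
    # Build one rotation matrix per sample from the quaternion components.
    R = np.empty((len(w), 3, 3))
    R[:, 0, 0] = 1 - 2 * (y**2 + z**2)
    R[:, 0, 1] = 2 * (x * y - w * z)
    R[:, 0, 2] = 2 * (x * z + w * y)
    R[:, 1, 0] = 2 * (x * y + w * z)
    R[:, 1, 1] = 1 - 2 * (x**2 + z**2)
    R[:, 1, 2] = 2 * (y * z - w * x)
    R[:, 2, 0] = 2 * (x * z - w * y)
    R[:, 2, 1] = 2 * (y * z + w * x)
    R[:, 2, 2] = 1 - 2 * (x**2 + y**2)
    # Apply each sample's rotation to its gaze vector.
    return np.einsum('nij,nj->ni', R, gaze_dir_head)

def classify_events(gaze_world, fs, sacc_thresh_deg_s=100.0):
    """Toy I-VT-style classifier on the head-compensated signal (illustrative only).

    gaze_world : (N, 3) unit gaze vectors in the world frame
    fs         : sampling rate in Hz
    Returns an (N-1,) array of 'saccade' / 'fixation' labels between samples.
    """
    # Angular velocity from the angle between consecutive gaze vectors.
    cos_ang = np.clip(np.sum(gaze_world[:-1] * gaze_world[1:], axis=1), -1.0, 1.0)
    vel_deg_s = np.degrees(np.arccos(cos_ang)) * fs
    return np.where(vel_deg_s > sacc_thresh_deg_s, 'saccade', 'fixation')

# Example: a 50 Hz recording with a static head (identity quaternion), so the
# world-frame gaze equals the head-frame gaze and all samples classify as fixation.
fs = 50.0
gaze = np.tile([0.0, 0.0, 1.0], (100, 1))
quat = np.tile([1.0, 0.0, 0.0, 0.0], (100, 1))
events = classify_events(head_compensate(gaze, quat), fs)
```

The key design point the sketch captures is the ordering: head compensation runs first, so the event detector operates on gaze in a world-fixed frame where head rotation no longer masquerades as eye movement, which is what makes the reported drop in fixation-interval standard deviation (8° to 3.3°) possible.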
