Abstract
Objective: This work addresses an impractical and unnatural constraint generally enforced by state-of-the-art electrooculography (EOG)-based gaze estimation methods: that the user maintain a stationary head pose and position. Specifically, it proposes an EOG-based gaze angle (GA) estimation method that accommodates natural variations in the user's head pose and position. The EOG data collected under non-stationary head conditions, used here to validate the proposed method, is also being made publicly available.

Methods: This work generalises a two-eye verging gaze geometrical model to cater for arbitrary head poses and positions, and also models the dynamics of the vestibulo-ocular reflex (VOR), the eye-head coordination that normally takes place during gaze shifts under unrestrained head conditions. These methods are validated by incorporating them within a published multiple-model framework for GA estimation.

Results: When applied to short EOG data segments, the method achieved horizontal and vertical GA estimation errors of 1.85 ± 0.51° and 2.19 ± 0.62°, respectively, and an eye movement detection and labelling F-score close to 90%. These results are comparable to those previously reported under stationary head conditions.

Conclusion: This work demonstrates that accurate GA estimation and eye movement detection and labelling can be achieved from EOG signals even when the user's head is not stationary.

Significance: By eliminating the need for users to maintain a stationary head pose and position, a common constraint in the field, this work introduces an EOG-based GA estimation method that allows users to move their heads naturally.
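The Methods statement bundles two geometric ideas: a vergence model, in which both eyes' gaze angles are determined by a shared 3-D point of regard given the head's pose and position, and the VOR, under which the eyes counter-rotate against head motion to keep gaze stable. The following is a minimal Python sketch of both under simplified assumptions; every name (verging_gaze_angles, vor_eye_velocity, IPD) and the fixed interpupillary distance are illustrative choices for this sketch, not the paper's notation or implementation.

```python
import numpy as np

# Assumed typical adult interpupillary distance, in metres (illustrative).
IPD = 0.063

def verging_gaze_angles(point_of_regard, head_position, head_rotation):
    """Horizontal/vertical gaze angles (degrees) of each eye verging on a
    3-D point, for an arbitrary head position and rotation matrix."""
    # Eye centres in head coordinates, offset left/right of the head origin.
    eyes_head = np.array([[-IPD / 2, 0.0, 0.0],
                          [ IPD / 2, 0.0, 0.0]])
    # Map eye centres into world coordinates using the head pose.
    eyes_world = head_position + eyes_head @ head_rotation.T
    angles = []
    for eye in eyes_world:
        # Line of sight from the eye to the point of regard, expressed
        # back in head coordinates so the angles are head-relative.
        los = head_rotation.T @ (point_of_regard - eye)
        az = np.degrees(np.arctan2(los[0], los[2]))                   # horizontal GA
        el = np.degrees(np.arctan2(los[1], np.hypot(los[0], los[2]))) # vertical GA
        angles.append((az, el))
    return angles

def vor_eye_velocity(head_angular_velocity, gain=1.0):
    """Idealised VOR: the eye counter-rotates at (approximately) the head's
    angular velocity, keeping the gaze direction stable in space."""
    return -gain * head_angular_velocity
```

For example, with the head at the origin facing forward (identity rotation) and a point of regard 1 m straight ahead, `verging_gaze_angles(np.array([0.0, 0.0, 1.0]), np.zeros(3), np.eye(3))` returns azimuths of roughly +1.8° and -1.8° for the left and right eye, i.e. the two lines of sight converge on the target, which is the vergence behaviour the model exploits.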