Objective: This work addresses an impractical and unnatural constraint commonly enforced by state-of-the-art electrooculography (EOG)-based gaze estimation methods: that the user maintain a stationary head pose and position. Specifically, this work proposes an EOG-based gaze angle (GA) estimation method that accommodates natural variations in the user's head pose and position. The EOG data collected under non-stationary head conditions and used to validate the proposed method are also being made publicly available.
Methods: This work generalises a two-eye verging gaze geometrical model to cater for arbitrary head poses and positions, and also models the dynamics of the vestibulo-ocular reflex (VOR), the eye-head coordination that naturally occurs during gaze shifts when the head is unrestrained. These methods are validated by incorporating them within a published multiple-model framework for GA estimation.
Results: When applied to short EOG data segments, horizontal and vertical GA estimation errors of 1.85 ± 0.51° and 2.19 ± 0.62°, respectively, were obtained, along with an eye movement detection and labelling F-score close to 90%. These results are comparable to those reported previously under stationary head conditions.
Conclusion: This work demonstrates that accurate GA estimation and eye movement detection and labelling can be achieved from EOG signals even when the user's head is not stationary.
Significance: This work eliminates the need for users to maintain a stationary head pose and position, a common constraint in the field, thereby introducing an EOG-based GA estimation method that allows users to move their heads naturally.
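As background for the quantities the abstract refers to (a sketch of standard oculomotor kinematics, not the paper's specific formulation): under unrestrained head conditions the gaze angle in space decomposes into the eye-in-head angle plus the head-in-space angle, and the VOR drives compensatory eye rotations that keep gaze approximately stable while the head moves. In a two-eye verging model, the vergence angle additionally links the interocular distance to the fixation depth. With hypothetical symbols (gaze angle \(\theta_g\), eye-in-head angle \(\theta_e\), head angle \(\theta_h\), VOR gain \(g \approx 1\), interocular distance \(d_{io}\), fixation distance \(d_f\)):

\[
\theta_g(t) = \theta_e(t) + \theta_h(t),
\qquad
\dot{\theta}_e(t) \approx -\,g\,\dot{\theta}_h(t),
\qquad
\alpha_{verg} \approx 2\arctan\!\left(\frac{d_{io}}{2\,d_f}\right).
\]

The first two relations illustrate the eye-head coordination the VOR model captures; the third illustrates the vergence geometry exploited by a two-eye gaze model. The paper's own generalisation to arbitrary head poses and positions is described in the full text.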