Automatic eye tracking is of interest for interaction with people suffering from amyotrophic lateral sclerosis, for using the eyes to control a computer mouse, and for controlled radiotherapy of uveal melanoma. It has been speculated that gaze estimation accuracy might be improved by exploiting the vestibulo-ocular reflex, an involuntary reflex that produces slow, compensatory eye movements opposing the direction of head motion. We therefore hypothesised that allowing the head to move freely during eye tracking would produce more accurate results than keeping the head fixed and allowing only the eyes to move. The purpose of this study was to create a low-cost eye-tracking system that incorporates the vestibulo-ocular reflex in gaze estimation by leaving the head free to move. The instrument comprised a low-cost head-mounted webcam that recorded a single eye. Pupil detection was fully automatic and ran in real time using a straightforward hybrid colour-based and model-based algorithm, despite the lower-end webcam used for recording and the absence of direct illumination. A model-based algorithm and an interpolation-based algorithm for gaze estimation were tested in this study. Based on the mean absolute angle difference in the gaze estimation results, we conclude that the model-based algorithm performed better when the head was not moving and equally well when the head was moving. With either algorithm, most deviations of the points of gaze from the target points were less than 1° when the head moved freely, well within the 2° benchmark from the literature, whereas deviations when the head was fixed exceeded 2°. The algorithms used had not previously been tested under passive illumination. This is the first study of a low-cost eye-tracking setup that takes the vestibulo-ocular reflex into account.
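The evaluation metric named above, mean absolute angle difference between estimated and target points of gaze, can be illustrated with a minimal sketch. The vector representation and the sample data below are assumptions for illustration only; they do not come from the study.

```python
import math

def angular_error_deg(gaze, target):
    """Angle in degrees between two 3-D gaze direction vectors."""
    dot = sum(g * t for g, t in zip(gaze, target))
    norm = math.sqrt(sum(g * g for g in gaze)) * math.sqrt(sum(t * t for t in target))
    # Clamp to [-1, 1] to guard acos against floating-point rounding.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

def mean_absolute_angle_difference(gazes, targets):
    """Mean absolute angular deviation across all fixation samples."""
    errors = [angular_error_deg(g, t) for g, t in zip(gazes, targets)]
    return sum(errors) / len(errors)

# Hypothetical samples: direction vectors toward screen targets (z points forward).
targets = [(0.0, 0.0, 1.0), (0.1, 0.0, 1.0), (0.0, 0.1, 1.0)]
gazes   = [(0.005, 0.0, 1.0), (0.11, 0.0, 1.0), (0.0, 0.09, 1.0)]

mad = mean_absolute_angle_difference(gazes, targets)
print(f"mean absolute angle difference: {mad:.2f} deg")
print("within the 2 deg benchmark:", mad < 2.0)
```

Under this reading, the study's result corresponds to most per-sample `angular_error_deg` values falling below 1° in the free-head condition.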