Abstract

Video-based eye tracking typically relies on tracking the pupil and a first-surface corneal reflection (CR) of an illumination source. The positional difference between these two features is used to determine the observer's eye-in-head orientation. With head-mounted eye trackers, this positional difference is unavoidably affected by relative movements between the eye tracking camera and the subject's eye. Video-based trackers also suffer from poor CR detection, for example when spurious reflections are mistaken for the desired CR. We approach these problems by modelling how these features—pupil and CR—are affected by different relative movements of the eye. Optical relationships between the offset of the apparent pupil centre and that of the CR are derived, and an experiment with five observers was conducted to support these derivations. Two applications of these offset relationships are provided. The first compensates for movements of the eye tracking camera with respect to the eye and reduces noise in the final eye orientation data. The second predicts CR locations for artefact removal in a new eye tracking system prototype.
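As a minimal sketch of the pupil-CR technique the abstract describes: the difference vector between the detected pupil centre and the CR is mapped to eye-in-head angles, and is (to first order) invariant to small translations of the camera relative to the eye. The linear per-axis calibration and the gain values below are hypothetical, purely for illustration; they are not the calibration used in the paper.

```python
def pupil_cr_gaze(pupil_px, cr_px, gain=(0.2, 0.2), offset=(0.0, 0.0)):
    """Estimate eye-in-head orientation (degrees) from the pupil-CR
    difference vector (pixels) using a simple per-axis linear mapping.

    gain/offset are hypothetical calibration constants obtained by having
    the observer fixate known targets; real systems often use higher-order
    polynomial mappings.
    """
    dx = pupil_px[0] - cr_px[0]
    dy = pupil_px[1] - cr_px[1]
    return (gain[0] * dx + offset[0], gain[1] * dy + offset[1])


# The difference vector is unchanged when pupil and CR shift together,
# which is why the technique tolerates small camera-to-eye translations:
a = pupil_cr_gaze((320, 240), (310, 235))
b = pupil_cr_gaze((325, 250), (315, 245))  # both features shifted by (+5, +10)
```

Because a pure translation of the camera moves the pupil image and the CR by (approximately) the same amount, `a` and `b` above are equal; the paper's derivations characterise when and how this approximation breaks down.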