Abstract

Eye tracking is a widely used technique, and many eye tracking datasets have been proposed to improve gaze estimation in different contexts. However, these datasets depend on per-user calibration, both during data capture and in downstream applications. We seek to construct a dataset that enables the design of a calibration-free eye tracking device, irrespective of users and scenes. To this end, we present ARGaze, a dataset of 1,321,968 pairs of eye gaze images at 32 × 32 pixel resolution and 50 corresponding videos of world views, captured with a replicable augmented reality headset. The data were collected from 25 participants, each of whom completed 30 min of eye gaze tasks in both real-world and augmented reality scenes. To validate the dataset, we compared it against state-of-the-art eye gaze datasets in terms of effectiveness and accuracy: without calibration to either scene, ARGaze achieved a record-low gaze estimation error of 3.70 degrees on average, and as low as 1.56 degrees for specific participants. Implications for generalising the use of the dataset are discussed.
