Abstract

Existing eye-gaze-tracking systems typically require multiple infrared (IR) lights and high-quality cameras to achieve good performance and robustness against head movement. This requirement limits the systems’ potential for broader applications. In this paper, we present a low-cost, non-intrusive, simple-setup gaze estimation system that can estimate the gaze direction under free head movement. In particular, the proposed system only uses a consumer depth camera (Kinect sensor) positioned at a distance from the subject. We develop a simple procedure to calibrate both the geometric relationship between the screen and the camera and the subject-specific parameters. A parameterized iris model, which can handle low-quality eye images, is then used to locate the iris center for gaze feature extraction. Finally, the gaze direction is determined from a 3D geometric eye model that accounts for head movement and for the deviation of the visual axis from the optical axis. Experimental results indicate that the system can estimate gaze with an accuracy of 1.4–2.7° and is robust against large head movements. Two real-time human–computer interaction (HCI) applications are presented to demonstrate the system’s potential for broad application.
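The abstract only outlines the pipeline, but the final geometric step it describes, mapping an eyeball center, a detected iris center, a subject-specific visual/optical axis offset (kappa), and the screen–camera calibration to an on-screen gaze point, can be sketched as below. This is a minimal illustration under assumed conventions: all function names, the kappa-angle parameterization, and the plane-intersection step are assumptions for exposition, not the authors' actual implementation.

```python
import numpy as np

def optical_axis(eyeball_center, iris_center):
    """Unit vector from the 3D eyeball center to the 3D iris center
    (the optical axis of the eye), both in camera coordinates."""
    v = np.asarray(iris_center, float) - np.asarray(eyeball_center, float)
    return v / np.linalg.norm(v)

def visual_axis(optical, kappa_h, kappa_v):
    """Rotate the optical axis by assumed horizontal/vertical kappa angles
    (radians) to approximate the visual axis; the paper's exact
    parameterization may differ."""
    ch, sh = np.cos(kappa_h), np.sin(kappa_h)
    cv, sv = np.cos(kappa_v), np.sin(kappa_v)
    R_h = np.array([[ch, 0, sh], [0, 1, 0], [-sh, 0, ch]])   # rotation about y
    R_v = np.array([[1, 0, 0], [0, cv, -sv], [0, sv, cv]])   # rotation about x
    v = R_v @ R_h @ np.asarray(optical, float)
    return v / np.linalg.norm(v)

def gaze_point_on_screen(eyeball_center, gaze_dir, screen_origin, screen_normal):
    """Intersect the gaze ray with the screen plane, both expressed in
    camera coordinates via the screen-camera calibration."""
    denom = float(np.dot(screen_normal, gaze_dir))
    if abs(denom) < 1e-9:
        raise ValueError("Gaze ray is parallel to the screen plane")
    t = np.dot(screen_normal, np.asarray(screen_origin, float) - eyeball_center) / denom
    return np.asarray(eyeball_center, float) + t * np.asarray(gaze_dir, float)
```

As a usage sketch, one would estimate the eyeball and iris centers per frame (the eyeball center tracked through head movement via the depth data), apply the calibrated kappa angles, and intersect the resulting visual-axis ray with the calibrated screen plane to obtain the point of regard.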
