Abstract

Most conventional gaze-tracking systems require users to look at many points during the initial calibration stage, which is inconvenient. To avoid this requirement, we propose a new gaze-tracking method with four important characteristics. First, our gaze-tracking system uses a large screen located at a distance from the user, who wears a lightweight device. Second, our system requires users to look at only four calibration points during the initial calibration stage, during which four pupil centers are recorded. Third, five additional points (virtual pupil centers) are generated by a multilayer perceptron (MLP) that takes the four actual points (detected pupil centers) as inputs. Fourth, when a user gazes at a large screen, the shape defined by the positions of the four pupil centers is a distorted quadrangle because of the nonlinear movement of the human eyeball, so gaze-detection accuracy is reduced if the pupil-movement area is mapped onto the screen area with a single transform function. We overcome this problem by calculating the gaze position with multi-geometric transforms based on the five virtual points and the four actual points. Experimental results show that the accuracy of the proposed method is better than that of other methods.
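
The abstract leaves the exact form of the multi-geometric mapping unspecified. The sketch below is a minimal illustration, assuming the five virtual pupil centers are the four edge midpoints plus the center of the pupil-movement quadrangle, so that the nine points form a 3x3 grid dividing the area into four sub-quadrangles, each mapped to its corresponding screen quadrant by its own perspective (homography) transform. The function names, the grid layout, and the simple region-selection rule are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def perspective_transform(src, dst):
        # Solve for the 3x3 homography H (with H[2,2] = 1) that maps the four
        # src points onto the four dst points: an 8x8 linear system.
        A, b = [], []
        for (x, y), (u, v) in zip(src, dst):
            A.append([x, y, 1, 0, 0, 0, -u * x, -u * y])
            A.append([0, 0, 0, x, y, 1, -v * x, -v * y])
            b.extend([u, v])
        h = np.linalg.solve(np.asarray(A, dtype=float), np.asarray(b, dtype=float))
        return np.append(h, 1.0).reshape(3, 3)

    def apply_homography(H, p):
        u, v, w = H @ np.array([p[0], p[1], 1.0])
        return np.array([u / w, v / w])

    def gaze_position(pupil, pupil_grid, screen_grid):
        # pupil_grid / screen_grid: 3x3 grids of (x, y) points, ordered
        # top-to-bottom and left-to-right -- the 4 detected corners plus the
        # 5 generated points (assumed: 4 edge midpoints and the center).
        cx, cy = pupil_grid[1][1]            # central virtual pupil point
        col = 0 if pupil[0] < cx else 1      # crude rule picking the
        row = 0 if pupil[1] < cy else 1      # sub-quadrangle with the pupil
        src = [pupil_grid[row][col], pupil_grid[row][col + 1],
               pupil_grid[row + 1][col + 1], pupil_grid[row + 1][col]]
        dst = [screen_grid[row][col], screen_grid[row][col + 1],
               screen_grid[row + 1][col + 1], screen_grid[row + 1][col]]
        return apply_homography(perspective_transform(src, dst), pupil)

    # Example with synthetic pixel coordinates (illustrative values only).
    pupil_grid = [[(100, 80), (160, 78), (220, 82)],
                  [(98, 140), (158, 141), (221, 143)],
                  [(95, 200), (157, 204), (223, 205)]]
    screen_grid = [[(0, 0), (960, 0), (1920, 0)],
                   [(0, 540), (960, 540), (1920, 540)],
                   [(0, 1080), (960, 1080), (1920, 1080)]]
    print(gaze_position((130, 110), pupil_grid, screen_grid))

Using one transform per sub-region lets each quadrant absorb the local distortion caused by the nonlinear eyeball movement, rather than forcing a single global mapping to fit the whole distorted quadrangle.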

Highlights

  • Gaze-tracking technology is used to detect a user’s gaze position in many applications, such as computer interfaces for the disabled, medical care, rehabilitation, and virtual reality [1,2,3]

  • To minimize the number of calibration points while maintaining the accuracy of gaze tracking, we propose a new gaze-tracking method based on the generation of virtual calibration points

  • We propose a new gaze-tracking method that improves the performance of a gaze-tracking system using a large screen at a distance

Summary

Introduction

Gaze-tracking technology is used to detect a user’s gaze position in many applications, such as computer interfaces for the disabled, medical care, rehabilitation, and virtual reality [1,2,3]. Gaze-tracking systems are generally divided into wearable and remote types. The wearable type requires the user to wear a device that includes a camera and a near-infrared (NIR) illuminator; when calculating the gaze position on a screen, tracking head movements requires additional NIR illuminators at the four corners of the screen or an additional camera [4,5,6]. With the remote type, the user does not need to wear a device, because a remote camera captures an image of the user’s eye, which is more convenient [7,8]. In both cases, however, the user needs to gaze at reference positions on the screen for calibration. To minimize this inconvenience, NIR illuminators can be attached to the four corners of the monitor so that the system requires the user to view only one position during kappa calibration [5,6,13].

Method
Overview of the Proposed Method
Proposed Gaze-Tracking Device
Detecting the Pupil Center
User Calibration by Gazing at the Four Corners of a Screen
Generating Five Virtual Points Using the MLP Algorithm
Calculating Final Gaze Position Using Multi-geometric Transforms
Experimental Results
Proposed Method
Conclusion