Abstract

We propose a real-time gaze estimation method based on facial-feature tracking using a single video camera. In our method, gaze directions are determined as 3D vectors connecting the eyeball and iris centers. Since the eyeball centers cannot be observed directly in images, the geometric relationship between the eyeball centers and the facial features, together with the eyeball radius (the face model), is computed in advance in a calibration process. The 2D positions of the eyeball centers can then be estimated from the face model and the facial-feature positions, and the gaze direction can be determined by tracking the facial features. In the calibration process, we use an image sequence (more than three frames) in which a subject moves his or her head while keeping his or her gaze fixed on the camera location. In this situation, the camera, the iris centers, and the eyeball centers lie on a straight line, so each eyeball center is observed at the position of the corresponding iris center. These observations make it straightforward to obtain the relationship between the eyeball centers and the facial features. Experimental results show that the gaze estimation accuracy of the proposed method is 4° horizontally and 7° vertically.
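To make the underlying geometry concrete, the following is a minimal Python sketch of the core idea stated in the abstract: once the 2D eyeball center and the eyeball radius are known (from the face model obtained in calibration), the gaze angle follows from the image offset of the iris center relative to the eyeball center. The function name, the orthographic-projection simplification, and the pixel-unit radius are assumptions made for illustration, not the paper's exact formulation.

```python
import numpy as np

def gaze_angles(eyeball_center_2d, iris_center_2d, eyeball_radius_px):
    """Return (horizontal, vertical) gaze angles in degrees.

    eyeball_center_2d : (x, y) image position of the eyeball center,
        estimated from the calibrated face model and tracked features.
    iris_center_2d    : (x, y) image position of the detected iris center.
    eyeball_radius_px : eyeball radius in pixels (from the calibration step).
    """
    dx = iris_center_2d[0] - eyeball_center_2d[0]
    dy = iris_center_2d[1] - eyeball_center_2d[1]
    # Under an assumed orthographic projection, the image offset of the iris
    # center is r * sin(theta) along each axis, so the angle is an arcsin.
    theta_h = np.degrees(np.arcsin(np.clip(dx / eyeball_radius_px, -1.0, 1.0)))
    theta_v = np.degrees(np.arcsin(np.clip(dy / eyeball_radius_px, -1.0, 1.0)))
    return theta_h, theta_v

# Example: iris displaced 3 px right and 1 px down on a 12 px-radius eyeball.
print(gaze_angles((320.0, 240.0), (323.0, 241.0), 12.0))
```

This also reflects why the calibration sequence works: when the subject looks at the camera, the iris offset is zero, so the observed iris position directly gives the projected eyeball center relative to the facial features.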
