Abstract

In this work, we propose a method for estimating a user's gaze point in the environment using images taken by an eye tracker and an omnidirectional camera. The proposed method estimates the gaze point in the environment by mapping the gaze point obtained by the eye tracker onto the omnidirectional camera image. However, matching the omnidirectional image and the eye tracker image is difficult because the omnidirectional image is distorted by the equirectangular projection. We therefore propose a method for locating the gaze point in the omnidirectional image by matching the eye tracker image to the omnidirectional image while accounting for this distortion. Specifically, the method repeats image matching and image conversion, warping the image with each matching result.
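The abstract does not give implementation details, but the repeat-match-and-convert loop it describes could look like the following minimal sketch. OpenCV, ORB features, RANSAC homographies, the function names (`match_homography`, `gaze_to_omni`), and the three-iteration default are all our assumptions for illustration, not the paper's stated choices.

```python
# Hypothetical sketch of iterating image matching and image conversion
# to map an eye-tracker gaze point into an equirectangular image.
import cv2
import numpy as np

def match_homography(eye_img, view_img):
    """Estimate a homography mapping eye-tracker image coordinates to
    coordinates of the current (possibly warped) omnidirectional view.
    ORB + RANSAC is an assumed choice, not the paper's stated method."""
    to_gray = lambda im: (cv2.cvtColor(im, cv2.COLOR_BGR2GRAY)
                          if im.ndim == 3 else im)
    orb = cv2.ORB_create(nfeatures=2000)
    kp_e, des_e = orb.detectAndCompute(to_gray(eye_img), None)
    kp_v, des_v = orb.detectAndCompute(to_gray(view_img), None)
    if des_e is None or des_v is None:
        return None
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_e, des_v)
    if len(matches) < 4:  # a homography needs at least 4 correspondences
        return None
    src = np.float32([kp_e[m.queryIdx].pt for m in matches])
    dst = np.float32([kp_v[m.trainIdx].pt for m in matches])
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    return H

def gaze_to_omni(eye_img, omni_img, gaze_xy, n_iters=3):
    """Map a gaze point from the eye-tracker image into the
    equirectangular image by repeating matching and conversion."""
    view = omni_img.copy()
    h, w = view.shape[:2]
    H_acc = np.eye(3)
    for _ in range(n_iters):
        H = match_homography(eye_img, view)
        if H is None:
            break
        H_acc = H_acc @ H
        # Convert (warp) the view using the matching result so it aligns
        # more closely with the eye-tracker image, reducing the local
        # equirectangular distortion before the next matching round.
        view = cv2.warpPerspective(
            view, H, (w, h),
            flags=cv2.INTER_LINEAR | cv2.WARP_INVERSE_MAP)
    # Map the gaze point through the accumulated transforms back into
    # the original omnidirectional image coordinates.
    p = H_acc @ np.array([gaze_xy[0], gaze_xy[1], 1.0])
    return p[:2] / p[2]
```

Note that this sketch resamples the warped view on every iteration; a more careful implementation would warp from the original omnidirectional image each round using the accumulated transform to avoid compounding interpolation blur.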
