Abstract

Accurate estimation of eye-related information is important for many applications such as gaze estimation, face alignment, and driver drowsiness detection. Earlier works fail to estimate eye information in low-resolution images captured by a regular camera or webcam. This paper develops an Iris Center (IC) and Eye Corner (EC) localization method for low-resolution facial images, with gaze estimation as the target application. A three-stage method is proposed for IC and EC localization. In the first stage, a circular gradient-intensity-based operator is proposed for rough IC estimation, and in the second stage a CNN model refines these to the true ICs. In the third stage, the Explicit Shape Regression (ESR) method is used for EC localization, initialized by anchoring the mean eye-contour shape model at the ICs. The proposed IC localization method is evaluated on the BioID and Gi4E databases and shows better accuracy compared to several state-of-the-art methods. The method is further evaluated for IC- and EC-based gaze estimation, which, unlike earlier infrared-illumination-based gaze trackers, requires no prior calibration. The gaze estimation experiment is performed on our proposed NITSGoP database, prepared under indoor conditions with complex backgrounds and uneven illumination. The experimental results suggest that the proposed method achieves good gaze estimation accuracy in both still images and videos.
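The abstract does not define the first-stage operator, but gradient-intensity objectives for iris localization typically score each candidate center by how well image gradients at strong edges point radially away from it, weighted by a darkness prior (the iris is darker than the sclera and skin). The sketch below illustrates that family of objectives; the function name, the edge threshold, and the darkness weighting are illustrative assumptions, not the paper's exact operator.

```python
import numpy as np

def rough_iris_center(eye_gray):
    """Rough iris-center estimate for a grayscale eye patch.

    Minimal sketch of a circular gradient-intensity objective (the
    paper's exact first-stage operator is not given in the abstract):
    gradients on strong edges vote for candidate centers they point
    away from, and darker candidates get larger weight.
    """
    eye = eye_gray.astype(float)
    gy, gx = np.gradient(eye)                 # row- and column-wise gradients
    mag = np.hypot(gx, gy)
    mask = mag > mag.mean()                   # keep strong edge pixels only
    ys, xs = np.nonzero(mask)
    gxn, gyn = gx[mask] / mag[mask], gy[mask] / mag[mask]

    h, w = eye.shape
    darkness = 255.0 - eye                    # darkness prior for the iris

    best, best_score = (h // 2, w // 2), -np.inf
    for cy in range(h):
        for cx in range(w):
            dy, dx = ys - cy, xs - cx         # displacement center -> edge pixel
            norm = np.hypot(dx, dy)
            valid = norm > 0
            # alignment of outward displacement with the gradient direction
            dot = (dx[valid] * gxn[valid] + dy[valid] * gyn[valid]) / norm[valid]
            score = darkness[cy, cx] * np.mean(np.maximum(dot, 0.0) ** 2)
            if score > best_score:
                best_score, best = score, (cy, cx)
    return best                               # (row, col) of estimated center
```

In the full pipeline this estimate would only seed the second stage, where the CNN decides which rough candidates are true ICs; the brute-force scan over all candidate centers is kept here for clarity rather than speed.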
