Abstract

The localization of eye centers is a very useful cue for numerous applications such as face recognition, facial expression recognition, and the early screening of neurological pathologies. Several methods that rely on available light for accurate eye-center localization have been explored. However, despite the considerable improvements that eye-center localization systems have undergone in recent years, only a few of these developments address the challenges posed by profile (non-frontal) faces. In this paper, we first use the explicit shape regression method to obtain a rough location of the eye centers. Because this method extracts global information from the human face, it is robust against changes in the eye region; we exploit this robustness by using it as a constraint. To locate the eye centers accurately, we employ isophote curvature features, whose accuracy has been demonstrated in a previous study. Applying these features yields a series of candidate locations for the actual position of the eye center. Among these candidates, the estimates that minimize the reconstruction error between the two methods above are taken as the closest approximation of the eye-center locations. We therefore combine explicit shape regression and isophote curvature feature analysis to achieve robustness and accuracy, respectively. In our experiments, we use the BioID and FERET datasets to test whether our approach obtains an accurate eye-center location while remaining robust against changes in scale and pose. In addition, we apply our method to non-frontal faces to test its robustness and accuracy, which are essential in gaze estimation but have seldom been addressed in previous works. Through extensive experimentation, we show that the proposed method achieves a significant improvement in accuracy and robustness over state-of-the-art techniques, with our method ranking second in terms of accuracy. In our implementation on a PC with a 2.5 GHz Xeon CPU, the eye-tracking process runs at a frame rate of 38 Hz.
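The paper's code is not reproduced here. Purely as an illustrative sketch of the two-stage idea the abstract describes, the snippet below implements isophote-curvature voting in the style of the maximum isocenter (MIC) approach and then selects, among the strongest accumulator peaks, the candidate closest to a rough prior estimate (e.g. one from shape regression). The function names, the `max_radius` filter, and the use of Euclidean distance to the prior as a stand-in for the paper's reconstruction error are our assumptions, not the authors' implementation.

```python
import numpy as np

def isophote_vote(L, max_radius=20.0):
    """MIC-style isophote-center voting (illustrative sketch only).

    Each pixel votes for the estimated center of curvature of the isophote
    passing through it, weighted by local curvedness."""
    Ly, Lx = np.gradient(L)          # np.gradient: axis 0 is y, axis 1 is x
    Lyy, _ = np.gradient(Ly)
    Lxy, Lxx = np.gradient(Lx)
    grad2 = Lx**2 + Ly**2
    # Denominator of the displacement-vector formula D = -grad(L)|grad(L)|^2 / denom
    denom = Ly**2 * Lxx - 2.0 * Lx * Lxy * Ly + Lx**2 * Lyy
    ok = np.abs(denom) > 1e-9
    safe = np.where(ok, denom, 1.0)
    Dx = np.where(ok, -Lx * grad2 / safe, 0.0)
    Dy = np.where(ok, -Ly * grad2 / safe, 0.0)
    curvedness = np.sqrt(Lxx**2 + 2.0 * Lxy**2 + Lyy**2)

    h, w = L.shape
    ys, xs = np.mgrid[0:h, 0:w]
    r = np.hypot(Dx, Dy)
    cx = np.rint(xs + Dx).astype(np.int64)
    cy = np.rint(ys + Dy).astype(np.int64)
    valid = ok & (r > 0.5) & (r <= max_radius) \
               & (cx >= 0) & (cx < w) & (cy >= 0) & (cy < h)
    acc = np.zeros_like(L)
    np.add.at(acc, (cy[valid], cx[valid]), curvedness[valid])
    return acc

def pick_candidate(acc, prior_xy, k=5):
    """Among the k strongest votes, return the (x, y) closest to the prior."""
    flat = np.argsort(acc, axis=None)[-k:]
    cand = np.column_stack(np.unravel_index(flat, acc.shape))  # rows: (y, x)
    d = np.hypot(cand[:, 1] - prior_xy[0], cand[:, 0] - prior_xy[1])
    y, x = cand[np.argmin(d)]
    return int(x), int(y)
```

On a synthetic dark circular blob the accumulator peaks at the blob center, since every isophote is a circle around it; the prior merely disambiguates between competing peaks in real eye images.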

Highlights

  • Human eye localization plays an important role in estimating the focus location in human–computer interaction (HCI)

  • Eye localization systems can be broadly classified into two categories: active eye-localization systems (AELS) and passive eye-localization systems (PELS)

  • To investigate the performance of our face alignment and invariant isocentric pattern (FAIIP) eye-center localization approach, we compared our model with the original maximum isocenter (MIC) method [18]



Introduction

Human eye localization plays an important role in estimating the focus location in human–computer interaction (HCI). The accuracy and robustness of iris-center (IC) localization significantly affect gaze tracking performance. Active eye-localization systems (AELS) have improved significantly and are beginning to play an important role in the HCI field. Unlike AELS, which use the image coordinates of the pupil and corneal reflection [5–9], passive eye-localization systems (PELS) attempt to obtain information about the IC location based only on the images supplied by a camera's video stream [10–14]. Due to the increased speed of computer processors, improved computer vision techniques, and the advent of high-performance digital cameras, the most desired type of IC localization output, which estimates the (x, y) coordinates of the user's gaze (i.e., it directly maps the IC location to a target plane such as the monitor screen), can be implemented. We discuss approaches to improve the robustness and accuracy of this technology in detail.
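The direct mapping from the IC image location to a point on a target plane such as the monitor is commonly realized as a calibrated regression. As an illustrative assumption only (the paper does not specify this model), the sketch below fits a least-squares affine map from iris-center coordinates to screen coordinates using a handful of calibration points; the function names and the choice of an affine model are ours.

```python
import numpy as np

def fit_ic_to_screen(ic_points, screen_points):
    """Fit a least-squares affine map from iris-center (x, y) to screen (x, y).

    ic_points, screen_points: arrays of shape (n, 2) with n >= 3
    calibration correspondences. Returns a (3, 2) mapping matrix A such
    that [x, y, 1] @ A approximates the screen coordinates."""
    ic = np.asarray(ic_points, dtype=float)
    scr = np.asarray(screen_points, dtype=float)
    X = np.hstack([ic, np.ones((len(ic), 1))])   # homogeneous coordinates
    A, *_ = np.linalg.lstsq(X, scr, rcond=None)
    return A

def ic_to_screen(A, ic_xy):
    """Map one iris-center coordinate through the fitted affine model."""
    x, y = ic_xy
    return np.array([x, y, 1.0]) @ A
```

An affine model absorbs scale, rotation, and offset between image and screen coordinates; systems needing to model screen-plane perspective distortion typically move to a homography or a low-order polynomial fit instead.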

