Abstract

Aiming to reduce the restrictions caused by person/scene dependence, we present a method that addresses appearance-based gaze estimation in a novel fashion. First, we introduce an "uncalibrated gaze pattern" and recover it solely from eye images, independently of the person and scene. The gaze pattern recovers gaze movements up to only scaling and translation ambiguities, via nonlinear dimension reduction and pixel motion analysis, and requires no training or calibration. This is new in the literature and enables novel applications. Second, our method allows a simple calibration to align the gaze pattern to any gaze target. This is much simpler than conventional calibrations, which rely on sufficient training data to compute person- and scene-specific nonlinear gaze mappings. Through various evaluations, we show that: 1) the proposed uncalibrated gaze pattern has novel and broad capabilities; 2) the proposed calibration is simple and efficient, and can even be omitted in some scenarios; and 3) quantitative evaluations produce promising results under various conditions.
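
To illustrate why leaving only scaling and translation ambiguities makes calibration light-weight, the following sketch fits a per-axis scale and offset that maps an uncalibrated 2D gaze pattern onto a few known target points via least squares. This is not the authors' implementation; the function name `align_gaze_pattern` and the sample point values are hypothetical, and the abstract does not specify the exact alignment procedure.

```python
import numpy as np

def align_gaze_pattern(pattern_pts, target_pts):
    """Fit a per-axis scale and translation mapping pattern -> target.

    pattern_pts, target_pts: (N, 2) arrays of corresponding 2D points.
    Returns (scale, offset), each of shape (2,), such that
    target ~= scale * pattern + offset.
    """
    pattern_pts = np.asarray(pattern_pts, dtype=float)
    target_pts = np.asarray(target_pts, dtype=float)
    scale = np.empty(2)
    offset = np.empty(2)
    for axis in range(2):
        # Least-squares fit of the 1-D linear model t = s * p + o on each axis.
        A = np.stack([pattern_pts[:, axis], np.ones(len(pattern_pts))], axis=1)
        s, o = np.linalg.lstsq(A, target_pts[:, axis], rcond=None)[0]
        scale[axis], offset[axis] = s, o
    return scale, offset

# Hypothetical example: a handful of reference fixations resolves the
# scale/translation ambiguity, with no nonlinear mapping to learn.
pattern = np.array([[0.10, 0.20], [0.80, 0.70], [0.45, 0.40]])        # uncalibrated pattern
targets = np.array([[100.0, 150.0], [800.0, 650.0], [450.0, 350.0]])  # screen coordinates
scale, offset = align_gaze_pattern(pattern, targets)
print(scale, offset)  # calibrated gaze = scale * pattern + offset
```

Because only four scalars (two scales, two offsets) are estimated, a few correspondences suffice, in contrast to data-hungry person- and scene-specific nonlinear gaze mappings.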
