Abstract

The recognition of gaze, for example mutual gaze, plays an important role in social interaction. Previous research shows that even infants are capable of detecting mutual gaze. Such abilities are relevant for robots that learn from interaction, for example for detecting when the robot is being addressed. Although various gaze tracking methods have been proposed, few appear to be openly available for robotic platforms such as iCub. In this paper we describe a gaze tracking system for humanoid robots that is built entirely on freely available libraries and data sets. Our system estimates horizontal and vertical gaze directions from the low-resolution VGA images of the robot's embodied vision at 30 frames per second. For this purpose we developed a pupil detection algorithm that combines existing approaches to increase robustness to noise. Our method combines the positions of face and eye features with context features such as eyelid correlates, and thus does not rely on a fixed head orientation. An evaluation on the iCub robot shows that our method estimates mutual gaze with 96% accuracy at 8° tolerance and one meter distance to the robot. The results further suggest that mutual gaze detection achieves higher accuracy in an embodied setup than in other configurations.
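
To illustrate the kind of pipeline the abstract outlines, the following minimal sketch estimates horizontal and vertical pupil positions from a single VGA frame using only freely available libraries (OpenCV and dlib). This is not the paper's implementation: the dark-blob pupil localization is one common existing approach, the landmark indices follow dlib's standard 68-point model, and the threshold value, mutual-gaze tolerances, and file names are hypothetical placeholders that would need tuning for real robot camera images.

    # Minimal gaze-direction sketch with OpenCV + dlib (not the paper's method).
    import cv2
    import dlib
    import numpy as np

    detector = dlib.get_frontal_face_detector()
    # Assumes dlib's publicly available 68-point landmark model file.
    predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

    def pupil_center(eye_gray):
        """Estimate the pupil center as the centroid of the darkest blob."""
        eye_gray = cv2.GaussianBlur(eye_gray, (5, 5), 0)
        # Threshold of 40 is a placeholder; real systems adapt it per image.
        _, thresh = cv2.threshold(eye_gray, 40, 255, cv2.THRESH_BINARY_INV)
        contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        if not contours:
            return None
        m = cv2.moments(max(contours, key=cv2.contourArea))
        if m["m00"] == 0:
            return None
        return (m["m10"] / m["m00"], m["m01"] / m["m00"])

    def gaze_ratios(frame_gray, shape, eye_idx):
        """Return (horizontal, vertical) pupil position normalized to the
        eye bounding box; values near 0.5 suggest gaze toward the camera."""
        pts = np.array([(shape.part(i).x, shape.part(i).y) for i in eye_idx],
                       dtype=np.int32)
        x, y, w, h = cv2.boundingRect(pts)
        center = pupil_center(frame_gray[y:y + h, x:x + w])
        if center is None:
            return None
        return (center[0] / w, center[1] / h)

    frame = cv2.imread("vga_frame.png")  # hypothetical 640x480 camera image
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray):
        shape = predictor(gray, face)
        left = gaze_ratios(gray, shape, range(36, 42))   # dlib left-eye points
        right = gaze_ratios(gray, shape, range(42, 48))  # dlib right-eye points
        if left and right:
            h = (left[0] + right[0]) / 2
            v = (left[1] + right[1]) / 2
            # Crude mutual-gaze test: pupil roughly centered on both axes.
            print("mutual gaze" if abs(h - 0.5) < 0.1 and abs(v - 0.5) < 0.15
                  else "averted gaze")

Note that this sketch classifies gaze from pupil position alone, whereas the paper's method additionally fuses face position and context features such as eyelid correlates to stay robust under varying head orientations.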
