Abstract
Gaze-following, which follows a person's gaze to estimate which object they are observing, is an effective way to understand intention in human–robot interaction. Most existing methods require the person and the object to appear in the same image; because of the camera's limited field of view, these methods are often impractical. To address this problem, we propose a gaze-following method that exploits a geometric map for better estimation. With the help of the map, the method remains effective even when the person and the observed object appear in different frames. Building on this method, we propose a novel gaze-based image captioning system, which, to the best of our knowledge, is studied here for the first time. Our experiments demonstrate that the system follows gaze and describes objects accurately. We believe this system is well suited to rehabilitation training for autistic children, elder-care service robots, and other applications.
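The abstract describes a pipeline that combines gaze estimation, a geometric map, and image captioning. The sketch below is only an illustration of that idea under assumed components; the names estimate_gaze_direction, GeometricMap, and caption_region are hypothetical placeholders (e.g. a CNN gaze regressor and a SLAM-style reconstruction), not the authors' actual implementation.

```python
# Minimal sketch of a gaze-following + captioning pipeline, assuming a pinhole
# camera model and hypothetical components; the paper's method may differ.
import numpy as np


def estimate_gaze_direction(head_crop: np.ndarray) -> np.ndarray:
    """Hypothetical gaze estimator: returns a unit 3D gaze vector in camera
    coordinates from a cropped head image (e.g. a CNN regressor)."""
    raise NotImplementedError


class GeometricMap:
    """Hypothetical 3D map of the environment (e.g. from SLAM or an RGB-D
    reconstruction) that supports ray casting and reprojection."""

    def cast_ray(self, origin: np.ndarray, direction: np.ndarray) -> np.ndarray:
        """Return the 3D point where the gaze ray first hits a map surface."""
        raise NotImplementedError

    def project_to_frame(self, point_3d: np.ndarray, frame_pose: np.ndarray,
                         intrinsics: np.ndarray) -> tuple[int, int]:
        """Project a 3D map point into the pixel coordinates of another frame."""
        raise NotImplementedError


def follow_gaze_and_caption(head_crop, head_position_3d,
                            gaze_map: GeometricMap, frames, captioner):
    """Estimate where the person is looking, locate that point in the map,
    and caption the corresponding image region, even when the attended object
    appears in a different frame than the person (cross-frame estimation)."""
    # 1. Estimate the 3D gaze direction from head appearance.
    gaze_dir = estimate_gaze_direction(head_crop)

    # 2. Intersect the gaze ray with the geometric map to get the gaze target.
    target_3d = gaze_map.cast_ray(origin=head_position_3d, direction=gaze_dir)

    # 3. Find a frame that sees the target point and crop a region around it.
    for frame, pose, K in frames:
        u, v = gaze_map.project_to_frame(target_3d, pose, K)
        if 0 <= u < frame.shape[1] and 0 <= v < frame.shape[0]:
            crop = frame[max(v - 64, 0):v + 64, max(u - 64, 0):u + 64]
            # 4. Describe the attended region with an image captioning model.
            return captioner(crop)

    return None  # gaze target not visible in any available frame
```

In this reading, the geometric map is what enables cross-frame estimation: the gaze ray is resolved to a 3D point once, and that point can then be looked up in whichever frame actually contains the attended object.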
Highlights
Humans are very good at understanding the intentions of others by following their gaze.
If robots have the capability of gaze-following, they will be competent at many human–robot interaction tasks, including helping doctors with rehabilitation training for autism [1,2].
For the first time, we study the problem of describing the region at which a person is looking, combining image captioning with gaze-following.
Summary
Humans are very good at understanding the intentions of others by following their gaze. For example, a suspect's attention at a crime scene can reveal crucial clues. This ability lets us obtain obscure but essential information. If robots had the capability of gaze-following, they would be competent at many human–robot interaction tasks, including helping doctors with rehabilitation training for autism [1,2]. This is the goal we set out to achieve. Autistic children are often interested in atypical objects, such as bottle caps or door handles, rather than the toys that non-autistic children like.