Abstract

Recently, many kinds of robots have been developed, and a growing number of them operate in human living spaces. One of the most important interactions between a robot and a human occurs when the human informs the robot of an object's location. The purpose of this work is to build an interface for informing a robot of an object's location in a human living space containing several objects. We assume that the robot has already located the user by sound source localization. First, the robot recognizes the user's pointing gesture and verbal cues and detects candidate object locations. The system estimates the pointing direction with a stereo camera and recognizes verbal cues; the pointing direction and the directive word are used to restrict the search space. When multiple candidate objects are detected, the system asks the user for additional features, such as a color name or the relative location among the candidates, and thereby identifies one of them. We conducted experiments on a dialog task with three objects in the search space. The system was able to specify the object through dialog, after which the robot moved toward it.
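To make the general idea concrete, the following is a minimal illustrative sketch (not the authors' implementation): candidate objects are restricted to those lying within a narrow cone around the estimated pointing ray, and remaining ambiguity is resolved with a color name obtained through dialog. The object list, cone angle threshold, and attribute names are assumptions introduced for this example.

```python
import numpy as np

# Hypothetical object records: position in the room frame plus a color label.
OBJECTS = [
    {"name": "cup",    "position": np.array([1.0, 0.5, 0.8]),  "color": "red"},
    {"name": "book",   "position": np.array([1.2, 0.6, 0.8]),  "color": "blue"},
    {"name": "bottle", "position": np.array([2.5, -1.0, 0.9]), "color": "red"},
]

def candidates_in_pointing_cone(origin, direction, objects, max_angle_deg=15.0):
    """Keep objects whose direction from the hand lies within a cone
    around the estimated pointing ray (the angle threshold is assumed)."""
    direction = direction / np.linalg.norm(direction)
    kept = []
    for obj in objects:
        to_obj = obj["position"] - origin
        to_obj = to_obj / np.linalg.norm(to_obj)
        cos_angle = np.clip(np.dot(direction, to_obj), -1.0, 1.0)
        if np.degrees(np.arccos(cos_angle)) <= max_angle_deg:
            kept.append(obj)
    return kept

def disambiguate_by_color(candidates, spoken_color):
    """Narrow multiple candidates using a color name given in the dialog."""
    return [obj for obj in candidates if obj["color"] == spoken_color]

# Example: pointing ray estimated from the user's hand position and direction.
hand = np.array([0.0, 0.0, 1.0])
ray = np.array([1.0, 0.5, -0.1])

candidates = candidates_in_pointing_cone(hand, ray, OBJECTS)
if len(candidates) > 1:
    # The system would ask, e.g., "Which color is it?" and parse the answer.
    candidates = disambiguate_by_color(candidates, "red")
print([obj["name"] for obj in candidates])  # -> ['cup']
```

In this sketch the pointing cone keeps the cup and the book but rejects the bottle, and the spoken color "red" then selects the cup; in the paper's system the candidate positions would come from stereo vision and the attributes from speech recognition rather than a fixed list.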
