Abstract

With increasing demands for quality of life, companion robots have gradually become a hotspot of application for healthy home living. In this article, a novel bionic human‐robot interaction (HRI) strategy using stereo vision algorithms has been developed to imitate the animal vision system on the Owl robot. Depth information about a target is obtained via two methods: vergence and disparity. Vergence requires physically tracking the target, moving each camera to align with a chosen object; through successive camera movements (saccades), a sparse depth map of the scene can be built up. Disparity, by contrast, requires the cameras to be fixed and parallel, using the position of the target within the field of view of a stereo pair of cameras to calculate distance. As disparity does not require the cameras to move, multiple targets can be chosen to build up a disparity map, providing depth information for the whole scene. In addition, a salience model is implemented to imitate how people explore a scene. This is achieved with feature maps, which apply filtering to the scene to highlight areas of interest, such as color and edges; this is a purely bottom‐up approach based on Itti and Koch's saliency model. A series of experiments has been conducted on the Plymouth Owl robot to validate the proposed interface.
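The disparity method described above rests on the standard pinhole relation for a fixed, parallel stereo pair: depth Z = f·B / d, where f is the focal length in pixels, B is the camera baseline, and d is the disparity. The sketch below illustrates this conversion; the focal length and baseline values are illustrative assumptions, not parameters of the Owl robot.

```python
import numpy as np

# Depth from disparity for a fixed, parallel stereo pair:
#   Z = f * B / d
# f: focal length (pixels), B: baseline (metres), d: disparity (pixels).
# These constants are illustrative only, not taken from the article.
FOCAL_LENGTH_PX = 700.0
BASELINE_M = 0.12

def depth_from_disparity(disparity_px):
    """Convert a disparity map (pixels) to a depth map (metres)."""
    d = np.asarray(disparity_px, dtype=float)
    depth = np.full_like(d, np.inf)
    valid = d > 0                      # zero disparity -> point at infinity
    depth[valid] = FOCAL_LENGTH_PX * BASELINE_M / d[valid]
    return depth

# Larger disparity means the target is closer to the cameras.
print(depth_from_disparity([70.0, 35.0, 0.0]))  # [1.2 2.4 inf]
```

Because this needs only pixel positions in the two images, it can be evaluated for every matched point at once, which is why a full disparity map of the scene can be built without moving the cameras.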
