Abstract

In this paper, we develop a robotic telepresence system that provides remote users with immersive embodiment in local environments through a custom-designed mobile robot. The proposed telepresence system uses a virtual reality (VR) device to connect a remote user to the robot. Three-dimensional visual data from an RGB-D camera are rendered for real-time stereoscopic display in the VR device, forming a tightly coupled human-machine system and creating an immersive telepresence experience. A user study shows that a better user experience is achieved when the robot tracks the active speaker while remaining aware of the remote user's intentions. To this end, we propose a human-robot collaborative control framework based on human intention recognition and sound source localization. The remote user's head-movement intentions are inferred from the motion of the VR device using hidden Markov models, and the speaker is tracked through sound source localization with a microphone array. A collaborative control scheme then fuses the control inputs of the robot and the remote user. Experiments are conducted in both one-to-one and one-to-two remote conversation scenarios. The results show that the proposed system significantly improves the immersiveness and performance of robotic telepresence, thereby greatly enhancing the user experience.
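To illustrate the collaborative control idea described above, the following is a minimal sketch of one way the remote user's intended motion could be fused with the robot's speaker-tracking motion. The weighting scheme and all names (blend_yaw_command, p_intentional, etc.) are assumptions for illustration only and are not taken from the paper.

```python
# Minimal sketch (not the authors' implementation): blending the remote
# user's intended yaw motion with the robot's speaker-tracking motion.
# All function and parameter names are hypothetical.

def blend_yaw_command(user_yaw_rate, robot_yaw_rate, p_intentional):
    """
    Confidence-weighted fusion of two yaw-rate commands.

    user_yaw_rate   -- yaw rate implied by the VR headset motion (rad/s)
    robot_yaw_rate  -- yaw rate toward the localized speaker (rad/s)
    p_intentional   -- probability (e.g., from an HMM over head motion) that
                       the user's movement reflects a deliberate intention
                       to look around
    """
    # When the head motion is judged intentional, defer to the user;
    # otherwise let the robot keep the active speaker in view.
    return p_intentional * user_yaw_rate + (1.0 - p_intentional) * robot_yaw_rate


if __name__ == "__main__":
    # Example: the user's head is nearly still (low intention probability),
    # so the blended command is dominated by the speaker-tracking term.
    cmd = blend_yaw_command(user_yaw_rate=0.05,
                            robot_yaw_rate=0.40,
                            p_intentional=0.2)
    print(f"blended yaw command: {cmd:.3f} rad/s")
```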
