Abstract

This paper proposes a novel teleoperation method that allows users to guide robots directly by hand, combined with speech. In this method, a virtual robot modeled on the remote real robot is projected into the local real environment to form a 3D operation interface, so users can interact with virtual objects directly with their hands. Furthermore, since the Leap Motion is attached to the augmented reality (AR) glasses, the operation space is greatly extended: users can observe the virtual robot from any angle without blind spots in this mobile setup, which enhances the users' interactive immersion and provides more natural human-machine interaction. To improve measurement accuracy, an unscented Kalman filter (UKF) and an improved particle filter (IPF) are used to estimate the position and orientation of the hand, respectively. In addition, Term Frequency-Inverse Document Frequency (TF-IDF) features and a maximum entropy model are adopted to recognize the user's speech and gesture instructions. The proposed method is compared with three other human-machine interaction methods in various experiments, and the results verify its effectiveness.
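
The paper's concrete state-space model is not reproduced on this page, so the following is only a minimal sketch of the UKF position-estimation step named above, assuming a constant-velocity motion model for the palm; the frame rate, noise covariances, and state layout are illustrative assumptions, not the authors' values. It uses the filterpy library.

```python
# Minimal UKF sketch (assumed constant-velocity model, not the paper's exact
# filter): smooth a noisy 3-D palm position reported by the Leap Motion.
import numpy as np
from filterpy.kalman import MerweScaledSigmaPoints, UnscentedKalmanFilter

dt = 1.0 / 60.0  # assumed Leap Motion frame period (s)

def fx(x, dt):
    # Process model: state = [px, py, pz, vx, vy, vz], constant velocity.
    F = np.eye(6)
    F[0, 3] = F[1, 4] = F[2, 5] = dt
    return F @ x

def hx(x):
    # Measurement model: the sensor observes position only.
    return x[:3]

points = MerweScaledSigmaPoints(n=6, alpha=0.1, beta=2.0, kappa=0.0)
ukf = UnscentedKalmanFilter(dim_x=6, dim_z=3, dt=dt, fx=fx, hx=hx, points=points)
ukf.x = np.zeros(6)        # initial state
ukf.P *= 0.1               # initial state covariance
ukf.R = np.eye(3) * 1e-4   # measurement noise (illustrative)
ukf.Q = np.eye(6) * 1e-5   # process noise (illustrative)

def filter_step(z_palm):
    """Run one predict/update cycle per Leap Motion frame."""
    ukf.predict()
    ukf.update(np.asarray(z_palm, dtype=float))
    return ukf.x[:3]  # smoothed palm position
```

An analogous predict/update loop with a particle filter would handle orientation, but the paper's improved resampling scheme is not described here, so it is omitted.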

Highlights

  • Robots play an increasingly pivotal role in the development of technology

  • This paper presents a natural mobile human-machine interaction method that uses augmented reality to remove the sharp border between the virtual robot and the operator's real hands

  • The Leap Motion captures the motion of the hand and maps it into the virtual world to guide the virtual robot; a minimal coordinate-mapping sketch follows this list
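
Because the Leap Motion is head-mounted, each detected hand point must be chained through the current head pose before it can drive the virtual robot. The sketch below illustrates that mapping under a hypothetical mounting calibration T_head_leap; the actual calibration procedure and values belong to the paper's coordinate-registration step and are not reproduced here.

```python
# Minimal coordinate-mapping sketch: Leap Motion point -> AR world frame.
# T_head_leap is a hypothetical mounting calibration; the real values would
# come from the paper's coordinate-registration procedure.
import numpy as np

T_head_leap = np.eye(4)                  # Leap frame expressed in the head frame
T_head_leap[:3, 3] = [0.0, -0.05, 0.08]  # illustrative mounting offset (m)

def leap_to_world(p_leap, T_world_head):
    """Map a 3-D Leap Motion point into the AR world frame.

    T_world_head is the current head pose reported by the AR glasses; because
    the sensor moves with the head, the chain is world <- head <- leap.
    """
    T_world_leap = T_world_head @ T_head_leap
    p = np.append(np.asarray(p_leap, dtype=float), 1.0)  # homogeneous point
    return (T_world_leap @ p)[:3]
```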

Summary

INTRODUCTION

Robots play an increasingly pivotal role in the development of technology. To address the problems above, this paper proposes a novel natural mobile human-machine interaction method in which a mobile gesture sensor and an augmented reality (AR) wearable device are combined to fuse the virtual robot with the operator's real hands, so that the operator can interact naturally with the virtual robot through gestures and speech. The main contributions of this paper are summarized as follows: 1) We propose a natural mobile human-machine interaction method that adopts augmented reality technology to remove the clear border between the virtual robots and the operator's real hands.

OVERVIEW

The intuitive interaction process of the proposed mobile human-machine interaction method is shown in Fig. 1, where two real robots (see Fig. 1(e)) in the remote environment have been projected into two virtual robots (see Fig. 1(a)).
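
For the speech side of this gesture-and-speech interaction, the abstract names TF-IDF features with a maximum entropy model. As a rough illustration only, the sketch below classifies transcribed speech commands with scikit-learn, using multinomial logistic regression as the maximum-entropy classifier; the command vocabulary and training phrases are invented for the example and are not the paper's dataset.

```python
# Illustrative sketch (not the paper's trained model): TF-IDF features plus a
# maximum-entropy (multinomial logistic regression) classifier for mapping
# transcribed speech to discrete robot commands. Training data is made up.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

train_phrases = [
    "open the gripper", "release the object",
    "close the gripper", "grasp the object",
    "move the arm to the left", "shift left a little",
]
train_labels = ["open", "open", "close", "close", "move_left", "move_left"]

clf = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
clf.fit(train_phrases, train_labels)

print(clf.predict(["please grasp the object"]))  # expected: ['close']
```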

COORDINATE REGISTRATION
INTERACTIVE DETECTION
POSITION AND ORIENTATION ESTIMATION
POSITION ESTIMATION WITH UKF
ORIENTATION ESTIMATION USING IPF
ELIMINATING THE EFFECT OF HEAD SHAKING
MULTIMODAL INSTRUCTION GENERATION
DISCUSSION
CONCLUSION