Abstract

This study develops a telepresence system, Teleyes, to reduce visual distortion in remote environments. The primary objective of this research is to take advantage of state-of-the-art three-dimensional (3D) input/output technologies and to develop an avatar-like mechanism that synchronizes the physical behavior of an operator with a remote system. Two 3D input/output methods are used in this research: stereoscopic vision and motion tracking. The system is designed to operate under closed-loop control involving human feedback. Two cameras, optimized to match the angle of view and perspective of human eyes, serve as the stereoscopic vision inputs on the unmanned vehicle. At the operator end, a head-mounted display presents the stereoscopic image and tracks the operator's head movement with embedded sensors. The head-movement tracking data are converted into control signals and sent back to the unmanned vehicle to drive a three-axis gimbal mechanism on which the two cameras are mounted. The Teleyes system has been validated in a designed experimental application scenario and compared with current methods using five different operators. On average, the proposed system reduced operators' distance error by 45.1% and task time by 18.2%. The results show that the system significantly improves the visual experience and operating efficiency, and thus has the potential to save resources and expand the applications of unmanned vehicle systems (UVSs). The developed system provides the operator with a realistic first-person view of a UVS and a visual experience similar to being onboard.
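To illustrate the closed-loop control path the abstract describes, the sketch below shows one possible way head-orientation data from the head-mounted display could be mapped to setpoints for the three-axis camera gimbal. This is a minimal illustration, not the paper's implementation: the `HeadPose` structure, the angle limits, and the command format are all hypothetical assumptions, since the abstract does not specify them.

```python
from dataclasses import dataclass

# Hypothetical gimbal travel limits in degrees; the actual Teleyes limits
# are not stated in the abstract.
YAW_LIMIT = 90.0
PITCH_LIMIT = 45.0
ROLL_LIMIT = 30.0


@dataclass
class HeadPose:
    """Head orientation reported by the HMD's embedded sensors (degrees)."""
    yaw: float
    pitch: float
    roll: float


def clamp(value: float, limit: float) -> float:
    """Restrict a commanded angle to the gimbal's mechanical range."""
    return max(-limit, min(limit, value))


def head_pose_to_gimbal_command(pose: HeadPose) -> dict:
    """Map the operator's head orientation to three-axis gimbal setpoints.

    The remote gimbal mirrors the operator's head so that the stereo
    camera pair keeps pointing where the operator is looking.
    """
    return {
        "yaw": clamp(pose.yaw, YAW_LIMIT),
        "pitch": clamp(pose.pitch, PITCH_LIMIT),
        "roll": clamp(pose.roll, ROLL_LIMIT),
    }


if __name__ == "__main__":
    # Example: operator turns the head 30 deg right and tilts 10 deg down.
    command = head_pose_to_gimbal_command(HeadPose(yaw=30.0, pitch=-10.0, roll=0.0))
    print(command)  # {'yaw': 30.0, 'pitch': -10.0, 'roll': 0.0}
```

In a real system the command would be streamed to the vehicle over the telemetry link and smoothed or rate-limited before reaching the gimbal motors; those details are omitted here.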
