Abstract
The management of remote services, such as remote surgery, remote sensing, and remote driving, has become increasingly important, especially with emerging 5G and Beyond-5G technologies. However, the strict network requirements of these remote services represent one of the major challenges hindering their fast and large-scale deployment in critical infrastructures. This article addresses issues inherent in the remote and immersive control of virtual reality (VR)-based unmanned aerial vehicles (UAVs), whereby a user remotely controls UAVs, equipped with 360° cameras, using a head-mounted device (HMD) and its respective controllers. Remote and immersive control services using 360° video streams require much lower latency and higher throughput to achieve true immersion and high service reliability. To assess and analyze these requirements, this article introduces a real-life testbed system that leverages several technologies (e.g., VR, 360° video streaming over 4G/5G, and edge computing). In the performance evaluation, three latency types are considered, namely: 1) the glass-to-glass latency between the 360° camera of a remote UAV and the HMD display; 2) the user/pilot's reaction latency; and 3) the command/execution latency. The obtained results indicate that the responsiveness (dubbed Glass-to-Reaction-to-Execution (GRE) latency) of a pilot, using our system, to a sudden event is within an acceptable range, i.e., around 900 ms.
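As a minimal illustrative sketch (not taken from the paper), the end-to-end GRE latency can be modeled as the sum of the three latency components the abstract enumerates. The component values used below are hypothetical placeholders, chosen only so that the total lands near the reported ~900 ms figure; the paper itself reports only the aggregate.

```python
def gre_latency_ms(glass_to_glass_ms: float,
                   reaction_ms: float,
                   command_execution_ms: float) -> float:
    """Total Glass-to-Reaction-to-Execution (GRE) latency in milliseconds,
    modeled as the sum of the three components described in the abstract."""
    return glass_to_glass_ms + reaction_ms + command_execution_ms


# Hypothetical component breakdown (ms); only the ~900 ms total is reported.
total = gre_latency_ms(glass_to_glass_ms=400.0,
                       reaction_ms=350.0,
                       command_execution_ms=150.0)
print(total)  # 900.0
```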