Abstract
In this work we present initial results of a system that combines wearable technology and monocular simultaneous localisation and mapping (SLAM) for the remote control of a low-cost micro aerial vehicle (MAV) that flies beyond the visual line of sight. To this end, as a first step, we use a state-of-the-art visual SLAM system, ORB-SLAM, to create a 3D map of the scene. The visual data feeding ORB-SLAM comes from imagery transmitted by the on-board camera of our low-cost vehicle. The vehicle cannot process data on board; however, it can transmit images at a rate of 15–20 Hz, which we found sufficient to carry out the visual localisation and mapping. The second step in our system is to replace the conventional controller with a pair of wearable-sensor-based gloves worn by the user, so that the MAV can be commanded by hand gestures alone. Our goal is to show that the user can fly the vehicle beyond the line of sight using only the vehicle's pose and map estimates in real time, and that commanding the MAV with hand gestures lets the user focus more on the flight task. Our preliminary results indicate the feasibility of our approach.
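The abstract does not describe the gesture interface in detail; the sketch below is only a minimal illustration of how a glove-based command loop of this kind could look, assuming flex-sensor gloves and a velocity set-point interface to the MAV. All names here (read_flex_sensors, send_velocity, the gesture vocabulary and thresholds) are hypothetical and are not the authors' implementation.

```python
# Illustrative sketch of a glove-based gesture-to-command loop.
# Hypothetical example; not the system described in the paper.

import time
from dataclasses import dataclass

@dataclass
class Velocity:
    vx: float        # forward (m/s)
    vy: float        # lateral (m/s)
    vz: float        # vertical (m/s)
    yaw_rate: float  # rad/s

# Hypothetical gesture vocabulary mapped to velocity set-points.
# A richer vocabulary (e.g. palm orientation) would need an IMU
# in addition to flex sensors.
GESTURE_COMMANDS = {
    "fist":      Velocity(0.0, 0.0, 0.0, 0.0),  # hover in place
    "open_palm": Velocity(0.0, 0.0, 0.3, 0.0),  # ascend
    "point":     Velocity(0.5, 0.0, 0.0, 0.0),  # move forward
}

def classify_gesture(flex_readings):
    """Map raw flex-sensor readings (one value per finger, ordered
    thumb..pinky, 0 = straight, 1 = fully bent) to a gesture label
    using simple thresholds."""
    if all(r > 0.8 for r in flex_readings):
        return "fist"
    if all(r < 0.2 for r in flex_readings):
        return "open_palm"
    if flex_readings[1] < 0.2 and all(r > 0.8 for r in flex_readings[2:]):
        return "point"
    return "fist"  # unrecognised gestures default to hover

def control_loop(read_flex_sensors, send_velocity, rate_hz=10):
    """Poll the glove, classify the gesture, and stream velocity
    commands to the MAV at a fixed rate. The two callables are
    supplied by the glove driver and the MAV link, respectively."""
    period = 1.0 / rate_hz
    while True:
        gesture = classify_gesture(read_flex_sensors())
        send_velocity(GESTURE_COMMANDS[gesture])
        time.sleep(period)
```

Defaulting unrecognised gestures to hover is a common fail-safe choice for a vehicle flying out of the operator's sight, since the pilot relies only on the SLAM pose and map estimates for situational awareness.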