Abstract
This work presents a vision-based navigation system for a vertical takeoff and landing unmanned aerial vehicle (UAV). The system is a monocular, vision-based, simultaneous localization and mapping (SLAM) system that measures the position and orientation of the camera and builds a map of the environment using the video stream from a single camera. This differs from past SLAM solutions on UAVs, which rely on sensors that measure depth directly, such as LIDAR, stereoscopic cameras, or depth cameras. The solution presented in this paper extends and significantly modifies a recent open-source algorithm that solves the SLAM problem using an approach fundamentally different from the traditional one. The proposed modifications provide the position measurements necessary for the navigation solution on a UAV. The main contributions of this work are: (1) an extension of the map-building algorithm so that it can be used realistically while controlling a UAV and simultaneously building the map; (2) improved performance of the SLAM algorithm at lower camera frame rates; and (3) the first known demonstration of a monocular SLAM algorithm successfully controlling a UAV while simultaneously building the map. This work demonstrates that a fully autonomous UAV that uses monocular vision for navigation is feasible.