Abstract

We propose a solution to the problem of autonomous flight in man-made indoor environments with a micro aerial vehicle (MAV), using a frontal camera, a downward-facing sonar, and odometry inputs. While steering the MAV towards distant features that we call vistas, we build a map of the environment in a parallel tracking and mapping fashion to infer the wall structure and avoid lateral collisions in real time. Our framework overcomes the limitations of traditional monocular SLAM approaches, which are prone to failure when operating in feature-poor environments and when the camera undergoes pure rotation. First, we overcome the common dependency on feature-rich environments by detecting Wall–Floor Features (WFFs), a novel type of low-dimensional landmark specifically designed for man-made environments to capture the geometric structure of the scene. We show that WFFs not only reveal the structure of the scene but can also be tracked reliably. Second, we cope with difficult robot motions and environments by fusing the visual data with odometry measurements in a principled manner. This allows the robot to continue tracking when it undergoes pure rotation and when it temporarily navigates through a completely featureless environment. We demonstrate our results on a small, commercially available quad-rotor platform flying in a typical feature-poor indoor environment.
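The abstract does not detail how WFFs are extracted. As a minimal illustrative sketch, one plausible reading is that a WFF corresponds to the image point where a near-vertical wall edge meets the wall–floor boundary in the frontal camera view. The Python/OpenCV snippet below implements that reading; the function name `detect_wall_floor_features`, all thresholds, and the line-grouping heuristic are assumptions for illustration, not the authors' actual detector.

```python
import cv2
import numpy as np


def detect_wall_floor_features(gray):
    """Hypothetical WFF detector sketch (not the paper's implementation).

    A WFF candidate is taken to be the lower endpoint of a near-vertical
    wall edge that lies close to a near-horizontal wall-floor boundary
    segment. Thresholds below are illustrative guesses.
    """
    edges = cv2.Canny(gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=60,
                            minLineLength=40, maxLineGap=5)
    if lines is None:
        return []

    verticals, horizontals = [], []
    for x1, y1, x2, y2 in lines[:, 0]:
        angle = np.degrees(np.arctan2(y2 - y1, x2 - x1)) % 180
        if 80 <= angle <= 100:             # near-vertical: candidate wall edge
            verticals.append((x1, y1, x2, y2))
        elif angle <= 15 or angle >= 165:  # near-horizontal: candidate floor line
            horizontals.append((x1, y1, x2, y2))

    features = []
    for vx1, vy1, vx2, vy2 in verticals:
        # Image y grows downward, so the foot of the wall edge is the
        # endpoint with the larger y coordinate.
        foot = (vx1, vy1) if vy1 > vy2 else (vx2, vy2)
        for hx1, hy1, hx2, hy2 in horizontals:
            if (min(hx1, hx2) - 10 <= foot[0] <= max(hx1, hx2) + 10
                    and abs(foot[1] - (hy1 + hy2) / 2) < 10):
                features.append(foot)
                break
    return features


if __name__ == "__main__":
    # Synthetic test frame: a white room with a wall-floor boundary and one
    # vertical wall edge meeting it.
    img = np.full((240, 320), 255, np.uint8)
    cv2.line(img, (0, 180), (319, 180), 0, 2)   # wall-floor boundary
    cv2.line(img, (160, 40), (160, 180), 0, 2)  # vertical wall edge
    print(detect_wall_floor_features(img))      # expect point(s) near (160, 180)
```

Such low-dimensional point features would be cheap to track frame-to-frame, which is consistent with the abstract's claim that WFFs both expose scene structure and remain reliably trackable in feature-poor corridors.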
