Abstract

Mission scenarios beyond line of sight or with limited ground control station access require safe autonomous navigation and a continuous extension of existing, potentially outdated obstacle information. The presented approach is a novel synthesis of techniques for 3D environment perception and global path planning. A locally bounded sensor fusion approach extracts sparse obstacles for global incremental path planning in an anytime fashion. During flight, a stereo camera monitors the field of view along the flight path ahead by analyzing depth images, and a 3D occupancy grid is built incrementally. To reduce the high data rate and storage demands of grid-type maps, an approximated polygonal world model is derived, using prisms and ground planes for a compact representation. This enables the system to continuously renew and update its knowledge of obstacles. An incremental heuristic path planner combines a priori information with incremental obstacle updates to ensure a collision-free path at all times. Mapping results from flight tests show the functionality of onboard world modeling from real sensor data; path planning feasibility is demonstrated in a simulation environment that accounts for world model changes inside the vehicle's field of view.
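
The abstract does not specify the update rule behind the incremental occupancy grid; as a rough, non-authoritative illustration of this kind of pipeline step, the sketch below integrates stereo depth points into a sparse 3D voxel grid with a standard log-odds scheme. All function names, resolutions, and thresholds are assumptions for illustration, not the paper's implementation.

```python
# Minimal sketch (assumed, not the paper's method): incremental log-odds
# occupancy update of a sparse 3D voxel grid from stereo depth points.
import numpy as np

VOXEL_SIZE = 0.5             # grid resolution in metres (assumed)
L_HIT, L_MISS = 0.85, -0.4   # log-odds increments (assumed)
L_MIN, L_MAX = -2.0, 3.5     # clamping bounds (assumed)

def voxel_key(p):
    """Map a 3D point to its integer voxel index."""
    return tuple(np.floor(np.asarray(p) / VOXEL_SIZE).astype(int))

def traverse(origin, endpoint):
    """Yield voxel keys along the ray from origin to endpoint
    (uniform sampling; an exact DDA traversal would also work)."""
    origin = np.asarray(origin, dtype=float)
    endpoint = np.asarray(endpoint, dtype=float)
    n = max(2, int(np.linalg.norm(endpoint - origin) / (VOXEL_SIZE * 0.5)))
    for t in np.linspace(0.0, 1.0, n):
        yield voxel_key(origin + t * (endpoint - origin))

def integrate_depth_points(grid, sensor_origin, points):
    """Update the sparse grid with one batch of depth measurements:
    free space along each ray, occupied at each measured endpoint."""
    for p in points:
        cells = list(dict.fromkeys(traverse(sensor_origin, p)))
        for c in cells[:-1]:       # cells in front of the hit are observed free
            grid[c] = max(L_MIN, grid.get(c, 0.0) + L_MISS)
        hit = cells[-1]            # measured surface cell is observed occupied
        grid[hit] = min(L_MAX, grid.get(hit, 0.0) + L_HIT)

grid = {}  # sparse map: voxel index -> log-odds occupancy
integrate_depth_points(grid, (0, 0, 0), [(4.0, 0.2, 1.0), (4.0, -0.3, 1.2)])
occupied = [k for k, v in grid.items() if v > 0.0]
print(f"{len(grid)} voxels touched, {len(occupied)} occupied")
```

Thresholding the log-odds values yields the sparse obstacle cells that a downstream polygonal compaction step or an incremental planner, as described in the abstract, could consume.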
