Abstract

This paper presents a perception-aware path planning framework for unmanned aerial vehicles (UAVs) that explicitly considers the perception quality of a light detection and ranging (LiDAR) sensor. Perception quality is quantified by how scattered the feature points used in LiDAR-based simultaneous localization and mapping are; keeping these features well spread out improves the accuracy of UAV pose estimation. In each planning step, the proposed framework selects, according to this perception quality, the best path from a library of candidate paths generated by the rapidly-exploring random tree (RRT) algorithm. Consequently, the UAV can autonomously fly to a destination in a receding-horizon manner. Several simulation trials in photorealistic environments confirm that the proposed path planner reduces pose estimation error by approximately 85% on average compared with a purely reactive path planner.
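To make the abstract's pipeline concrete, the sketch below illustrates the general idea of scoring candidate paths by the spatial spread of visible LiDAR features and picking the best one. It is a minimal illustration, not the paper's method: the function names (`rrt_candidate_paths`, `perception_quality`, `select_best_path`), the simplified goal-biased random expansion standing in for a full RRT, the eigenvalue-based spread metric, and parameters such as `sensor_range` are all assumptions introduced here for clarity.

```python
import numpy as np


def rrt_candidate_paths(start, goal, n_paths=10, steps=20, step_size=1.0, rng=None):
    """Generate a library of candidate paths via a simplified, goal-biased
    random expansion (a stand-in for the RRT planner used in the paper)."""
    rng = np.random.default_rng() if rng is None else rng
    paths = []
    for _ in range(n_paths):
        path = [np.asarray(start, dtype=float)]
        for _ in range(steps):
            direction = goal - path[-1]
            direction = direction / (np.linalg.norm(direction) + 1e-9)
            # Random perturbation mimics the tree's random exploration.
            noise = rng.normal(scale=0.5, size=3)
            path.append(path[-1] + step_size * (direction + noise))
        paths.append(np.vstack(path))
    return paths


def perception_quality(path, feature_points, sensor_range=15.0):
    """Score a path by how well-scattered the visible feature points are.
    Here the scatter is measured by the smallest eigenvalue of the covariance
    of features within sensor range, averaged along the path (an assumed
    metric, not necessarily the one used by the authors)."""
    scores = []
    for pose in path:
        dists = np.linalg.norm(feature_points - pose, axis=1)
        visible = feature_points[dists < sensor_range]
        if len(visible) < 4:
            scores.append(0.0)  # too few features: poor localizability
            continue
        cov = np.cov(visible.T)
        scores.append(np.min(np.linalg.eigvalsh(cov)))
    return float(np.mean(scores))


def select_best_path(paths, feature_points):
    """Pick the candidate path with the highest perception-quality score,
    as a receding-horizon planner would at each replanning step."""
    return max(paths, key=lambda p: perception_quality(p, feature_points))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    start, goal = np.zeros(3), np.array([20.0, 0.0, 5.0])
    features = rng.uniform(-10, 30, size=(200, 3))  # synthetic LiDAR feature map
    candidates = rrt_candidate_paths(start, goal, rng=rng)
    best = select_best_path(candidates, features)
    print("Best path perception score:", perception_quality(best, features))
```

In a receding-horizon setting, this selection would be repeated from the UAV's current pose at each replanning cycle, executing only the first portion of the chosen path before planning again.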
