Abstract

Demand is growing for unmanned air vehicles (UAVs) with greater autonomy, including the ability to navigate without GPS information, such as indoors. In this work, a novel visual odometry algorithm is developed and flight tested. It uses sequential pairs of red-green-blue-depth (RGBD) camera images to estimate the UAV's change in position and orientation (delta pose), which can be used to aid a navigation filter. Unlike existing related techniques, it uses a novel perturbation approach to estimate the uncertainty of the odometry measurement dynamically in real time; this technique is applicable to a wide range of sensor-preprocessing tasks that generate navigation-relevant measurements. Because the delta pose and its covariance are both estimated in real time, the odometry can be efficiently fused with other sensor measurements in a navigation filter. Indoor flight testing against a motion-capture reference demonstrated that the odometry and covariance estimates are accurate when appropriately scaled. The flights also demonstrated the algorithm running within a navigation filter to improve a velocity estimate, a significant improvement over the state of the art for RGBD odometry.
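The abstract does not spell out the perturbation scheme, so the following is only a minimal Python sketch of the general idea, not the paper's method. All names here (estimate_delta_pose, frames stored as dicts with a "depth" array, the noise level) are hypothetical: the input data are perturbed with synthetic noise, the odometry front end is re-run, and the sample covariance of the resulting delta-pose estimates serves as the dynamically estimated measurement covariance.

import numpy as np

def perturbation_covariance(estimate_delta_pose, prev_frame, curr_frame,
                            noise_sigma=0.01, n_samples=20, rng=None):
    # Estimate the covariance of an odometry delta-pose measurement by
    # perturbing the input data and observing the spread of the estimates.
    #
    # estimate_delta_pose: hypothetical callable mapping (prev_frame,
    #     curr_frame) to a 6-vector [dx, dy, dz, droll, dpitch, dyaw].
    # noise_sigma: std. dev. of the synthetic depth perturbation in meters
    #     (an assumed value; a real system would model the sensor noise).
    rng = np.random.default_rng() if rng is None else rng
    nominal = estimate_delta_pose(prev_frame, curr_frame)

    samples = []
    for _ in range(n_samples):
        # Perturb the depth channel of the current frame with white noise
        # and re-run the odometry on the perturbed data.
        noisy = dict(curr_frame)
        noisy["depth"] = curr_frame["depth"] + rng.normal(
            0.0, noise_sigma, curr_frame["depth"].shape)
        samples.append(estimate_delta_pose(prev_frame, noisy))

    # The sample covariance of the perturbed estimates approximates the
    # measurement covariance to report alongside the nominal delta pose.
    cov = np.cov(np.asarray(samples), rowvar=False)
    return nominal, cov

A Kalman-style navigation filter weights each measurement by its reported covariance, so supplying a per-frame covariance like this, rather than a fixed hand-tuned value, lets the filter automatically down-weight odometry from feature-poor or noisy frames.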
