Abstract

In challenging environments, reliable autonomous navigation is a key performance factor for robotics. This paper presents a bio-inspired multi-sensor navigation system that combines a skylight polarization compass with visual place recognition for UAVs operating in GNSS-denied environments. The system integrates position and heading constraints derived from a skylight polarization sensor, a micro-inertial sensor, and a monocular camera. An optimal orientation algorithm based on skylight polarization patterns is proposed to obtain the heading constraint. We also propose a two-dimensional visual place recognition method that provides the position constraint in aerial environments, and the metric encoding pattern and place recognition procedure are presented in detail. Based on these position and heading constraints, a biologically inspired multi-sensor integration system with a Kalman filter is developed to accurately estimate the motion of the UAV. Unlike previous approaches, which rely on fragile navigation schemes and accumulate path errors, our system builds a bionic and robust association between metric motion information and the sensor inputs, and uses reliable position and heading constraints to improve navigation performance. Flight experiments demonstrate that the proposed bio-inspired navigation system outperforms other vision-based navigation algorithms; the position errors converge and do not grow with flight distance, with a best position accuracy within 5 m. Finally, we present insights gained with respect to further work in robotics.
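The abstract does not give implementation details of the fusion stage; the following is a minimal sketch, assuming a simple planar state [x, y, heading], an EKF-style prediction from inertial dead reckoning, a heading update standing in for the polarization compass, and a position update standing in for visual place recognition. All class names, noise values, and the state layout are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of Kalman-filter fusion of heading and position constraints.
# State layout, noise parameters, and names are assumptions for illustration only.
import numpy as np

class FusionKF:
    """Planar filter fusing inertial propagation with heading (polarization
    compass) and position (visual place recognition) constraints."""

    def __init__(self):
        self.x = np.zeros(3)                      # state: [x, y, heading] (m, m, rad)
        self.P = np.eye(3)                        # state covariance
        self.Q = np.diag([0.1, 0.1, 0.01])        # process noise (assumed)

    def predict(self, v, omega, dt):
        """Propagate with body-frame speed v and yaw rate omega from the inertial sensor."""
        theta = self.x[2]
        self.x += np.array([v * np.cos(theta) * dt,
                            v * np.sin(theta) * dt,
                            omega * dt])
        # Jacobian of the motion model with respect to the state
        F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                      [0.0, 1.0,  v * np.cos(theta) * dt],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update_heading(self, heading_meas, sigma=np.deg2rad(1.0)):
        """Heading constraint, e.g. from the skylight polarization compass."""
        H = np.array([[0.0, 0.0, 1.0]])
        innov = np.array([heading_meas]) - H @ self.x
        innov = (innov + np.pi) % (2 * np.pi) - np.pi   # wrap to [-pi, pi]
        S = H @ self.P @ H.T + np.array([[sigma ** 2]])
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x += K @ innov
        self.P = (np.eye(3) - K @ H) @ self.P

    def update_position(self, pos_meas, sigma=3.0):
        """Position constraint, e.g. from visual place recognition."""
        H = np.array([[1.0, 0.0, 0.0],
                      [0.0, 1.0, 0.0]])
        R = np.eye(2) * sigma ** 2
        z = np.asarray(pos_meas, dtype=float)
        innov = z - H @ self.x
        S = H @ self.P @ H.T + R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x += K @ innov
        self.P = (np.eye(3) - K @ H) @ self.P


if __name__ == "__main__":
    kf = FusionKF()
    kf.predict(v=10.0, omega=0.05, dt=0.1)   # inertial propagation
    kf.update_heading(np.deg2rad(3.0))       # polarization-compass heading fix
    kf.update_position([1.1, 0.4])           # place-recognition position fix
    print(kf.x)
```

The key design point conveyed by the abstract is that both the absolute heading and the absolute position corrections bound the error, so the filtered estimate does not drift with flight distance the way pure dead reckoning does.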
