Abstract

A novel pure-vision egomotion estimation algorithm is presented, with extensions to Unmanned Aerial Systems (UAS) navigation through visual odometry. Our proposed method computes egomotion in two stages using panoramic images segmented into sky and ground regions. Rotations (in 3DOF) are estimated by using a customised algorithm to measure the motion of the sky image, which is affected only by the rotation of the aircraft, and not by its translation. The rotation estimate is then used to derotate the optic flow field generated by the ground, from which the translation of the aircraft (in 3DOF) is estimated by a second customised, iterative algorithm. Separating the rotation and translation estimates in this way allows a partial relaxation of the planar-ground assumption, inherently increasing the robustness of the approach. The translation vectors are scaled using stereo-based height and path-integrated to obtain the current UAS position for closed-loop navigation. Outdoor field tests on a small quadrotor UAS suggest that the technique's performance is comparable to that of existing state-of-the-art vision-based navigation algorithms, whilst removing all dependence on additional sensors such as an inertial measurement unit (IMU) or global positioning system (GPS).
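The two-stage structure described above lends itself to a compact numerical sketch. The Python example below illustrates only the second (derotation and translation) stage under stated assumptions: given a rotation estimate from the sky stage, the ground optic flow is derotated and translation is recovered with an ordinary linear least-squares solve over a spherical motion-field model above a planar ground. The least-squares solve stands in for the paper's customised iterative algorithm, and all function names, the geometry (camera at a stereo-derived height above the plane z = 0), and the synthetic data are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a derotation-and-translation stage, assuming a
# spherical-camera motion-field model and a planar ground. A plain linear
# least-squares solve stands in for the paper's customised iterative
# algorithm; all names and geometry here are illustrative.
import numpy as np

def derotate_flow(flow, dirs, omega):
    """Remove the rotational flow component (-omega x p) at each viewing
    direction p, leaving only the translation-induced flow."""
    return flow + np.cross(omega, dirs)

def estimate_translation(flow_trans, dirs, height):
    """Least-squares translation from derotated flow over a planar ground.

    Motion-field model on the unit sphere:
        flow_i = -(1/Z_i) * (I - p_i p_i^T) * t,
    with depth Z_i = -height / p_iz for a ray p_i hitting the plane z = 0
    from a camera at (0, 0, height); p_iz < 0 for ground-viewing rays.
    """
    blocks, rhs = [], []
    for p, f in zip(dirs, flow_trans):
        Z = -height / p[2]                       # depth along the viewing ray
        blocks.append(-(np.eye(3) - np.outer(p, p)) / Z)
        rhs.append(f)
    t, *_ = np.linalg.lstsq(np.vstack(blocks), np.concatenate(rhs), rcond=None)
    return t

# Synthetic check: generate flow from a known motion, then recover it.
rng = np.random.default_rng(0)
height = 5.0                                     # stereo-based height (metres)
d = rng.normal(size=(200, 3))
d[:, 2] = -np.abs(d[:, 2]) - 0.5                 # force downward-looking rays
dirs = d / np.linalg.norm(d, axis=1, keepdims=True)

omega_true = np.array([0.02, -0.01, 0.03])       # rad/s, from the sky stage
t_true = np.array([1.0, 0.5, -0.2])              # m/s
Z = -height / dirs[:, 2]
flow = -np.cross(omega_true, dirs) \
       - (t_true - (dirs @ t_true)[:, None] * dirs) / Z[:, None]

t_est = estimate_translation(derotate_flow(flow, dirs, omega_true), dirs, height)
print(t_est)                                     # ~ [1.0, 0.5, -0.2]
```

In this formulation the stereo-derived height fixes the metric scale of the translation, and summing the per-frame translation vectors over time yields the path-integrated position used for closed-loop navigation.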
