Abstract
This paper presents a novel dense optical-flow algorithm to solve the monocular simultaneous localisation and mapping (SLAM) problem for ground or aerial robots. Dense optical flow can effectively provide the ego-motion of the vehicle while enabling collision avoidance with potential obstacles. Existing research has not fully utilised the uncertainty of the optical flow; at most, an isotropic Gaussian density model has been used. We estimate the full uncertainty of the optical flow and propose a new eight-point algorithm based on the statistical Mahalanobis distance. Combined with pose-graph optimisation, the proposed method demonstrates enhanced robustness and accuracy on the public autonomous-car dataset (KITTI) and an aerial monocular dataset.
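As a rough illustration of the weighting idea (our notation, not necessarily the paper's): where the classic eight-point algorithm minimises the unweighted algebraic epipolar error, a Mahalanobis-distance formulation scales each residual by its propagated variance,

```latex
\min_{E}\ \sum_{i} \frac{\left(\hat{\mathbf{x}}_i'^{\top} E\, \hat{\mathbf{x}}_i\right)^{2}}{\sigma_i^{2}},
\qquad
\sigma_i^{2} = \mathbf{g}_i^{\top}\, \Sigma_i\, \mathbf{g}_i,
```

where each flow correspondence in normalised coordinates contributes one residual, Σᵢ is the estimated 2×2 covariance of the flow endpoint, and gᵢ is the gradient of the residual with respect to that measured endpoint.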
Highlights
Uncrewed aerial vehicles (UAVs) have drawn significant attention from the research community and industry in the last few decades
The UAV performs motions such as pure rotation and extreme height variation, which make accurate camera-pose estimation difficult for existing monocular visual odometry and simultaneous localisation and mapping (SLAM) methods
This paper presents a new Mahalanobis eight-point algorithm using dense optical flow
Summary
Uncrewed aerial vehicles (UAVs) have drawn significant attention from the research community and industry in the last few decades. Most monocular visual odometry methods use sparse feature points matched between images to compute the inter-frame motion [9,10,11]. The sparse matches may be clustered around a small area of the image or encounter problems with planar degeneracy [16], resulting in a biased motion estimate. These methods have not explicitly considered the uncertainty in the feature matching and monocular SLAM pipelines. We propose a novel robust monocular simultaneous localisation and mapping method in a principled probabilistic framework. This is accomplished by using dense optical flow with estimated uncertainty as the input to our visual odometry pipeline, as sketched below.
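To make the pipeline concrete, here is a minimal, hypothetical sketch of how a Mahalanobis-weighted eight-point solve could consume flow correspondences with per-pixel covariances. The function name, the iteratively-reweighted scheme, and the covariance propagation are our illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def eight_point_mahalanobis(x1, x2, covs, n_iter=5):
    """Hypothetical sketch of a Mahalanobis-weighted eight-point solve.

    x1, x2 : (N, 2) matched points in the two views (e.g. from dense flow).
    covs   : (N, 2, 2) per-match covariance of the flow endpoint x2.
    Each epipolar residual is scaled by the inverse of its propagated
    standard deviation, refined by iteratively reweighted least squares.
    """
    N = x1.shape[0]
    h1 = np.hstack([x1, np.ones((N, 1))])   # homogeneous points, view 1
    h2 = np.hstack([x2, np.ones((N, 1))])   # homogeneous points, view 2

    # Rows of the linear system A f = 0 (classic eight-point stacking).
    A = np.einsum('ni,nj->nij', h2, h1).reshape(N, 9)

    w = np.ones(N)                           # start with uniform weights
    for _ in range(n_iter):
        _, _, Vt = np.linalg.svd(A * w[:, None])
        F = Vt[-1].reshape(3, 3)             # smallest singular vector

        # Propagate flow covariance through the residual r = h2^T F h1:
        # the gradient of r w.r.t. the 2D endpoint x2 is the first two
        # components of F @ h1, so var(r) ~= g^T Sigma g.
        g = (F @ h1.T).T[:, :2]              # (N, 2) residual gradients
        var = np.einsum('ni,nij,nj->n', g, covs, g)
        w = 1.0 / np.sqrt(var + 1e-12)       # Mahalanobis-style weights

    # Enforce the rank-2 constraint on the final estimate.
    U, S, Vt = np.linalg.svd(F)
    S[2] = 0.0
    return U @ np.diag(S) @ Vt

if __name__ == "__main__":
    # Synthetic smoke test with fabricated "flow" and isotropic covariances.
    rng = np.random.default_rng(0)
    x1 = rng.uniform(-1.0, 1.0, size=(200, 2))
    x2 = x1 + rng.normal(scale=0.01, size=(200, 2))
    covs = np.tile(np.eye(2) * 1e-4, (200, 1, 1))
    print(eight_point_mahalanobis(x1, x2, covs))
```

Starting from uniform weights and reweighting by the propagated residual standard deviation is a standard iteratively-reweighted least-squares device; the paper's actual formulation of the Mahalanobis weighting may differ.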