Abstract

Oriented features from accelerated segment test (oFAST) and rotated binary robust independent elementary features (rBRIEF) SLAM2 (ORB-SLAM2) is a widely recognized, complete visual simultaneous localization and mapping (SLAM) framework with visual odometry as one of its core components. Because RGB-Depth (RGB-D) ORB-SLAM2 visual odometry suffers from accumulated error, which causes loss of camera tracking and trajectory drift, we designed and implemented an improved visual odometry method that reduces the cumulative error. First, this paper proposes an adaptive threshold oFAST algorithm to extract feature points from images; rBRIEF is then used to describe the feature points. Next, the fast library for approximate nearest neighbors (FLANN) strategy is used for coarse image matching, and the results are refined by progressive sample consensus (PROSAC). Matching precision is further improved with an epipolar line constraint derived from the essential matrix. Finally, the efficient Perspective-n-Point (EPnP) method is used to estimate the camera pose, and a least-squares optimization problem is constructed to refine the estimate and obtain the final camera pose. The experimental results show that the proposed method achieves better robustness, higher image matching accuracy and more accurate recovery of the camera motion trajectory.
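
The matching-and-pose chain described above can be approximated with standard OpenCV primitives. The following is a minimal, illustrative sketch rather than the paper's implementation: it uses OpenCV's stock ORB in place of the proposed adaptive-threshold oFAST, substitutes cv2.USAC_PROSAC (available in OpenCV >= 4.5) for the PROSAC step, and assumes a camera intrinsic matrix K and a grayscale image pair img1, img2 as inputs.

    import cv2
    import numpy as np

    def estimate_relative_pose(img1, img2, K):
        # 1. Feature extraction: stock ORB = oFAST detector + rBRIEF descriptor.
        orb = cv2.ORB_create(nfeatures=2000)
        kp1, des1 = orb.detectAndCompute(img1, None)
        kp2, des2 = orb.detectAndCompute(img2, None)

        # 2. Coarse matching with FLANN; an LSH index suits binary descriptors.
        flann = cv2.FlannBasedMatcher(
            dict(algorithm=6, table_number=6, key_size=12, multi_probe_level=1),
            dict(checks=50))
        pairs = flann.knnMatch(des1, des2, k=2)
        # Lowe ratio test; LSH can return fewer than 2 neighbors, so guard.
        good = [p[0] for p in pairs
                if len(p) == 2 and p[0].distance < 0.7 * p[1].distance]

        pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
        pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

        # 3. Robust essential-matrix fit (PROSAC variant); the inlier mask
        #    enforces the epipolar constraint x2' * E * x1 = 0 on the matches.
        E, mask = cv2.findEssentialMat(pts1, pts2, K,
                                       method=cv2.USAC_PROSAC, threshold=1.0)

        # 4. Decompose E into relative rotation R and translation t.
        _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
        return R, t

For RGB-D input, where depth yields 3D-2D correspondences, the pose would instead be estimated with EPnP (cv2.solvePnP with flags=cv2.SOLVEPNP_EPNP) and then refined by least squares, e.g. with cv2.solvePnPRefineLM, mirroring the abstract's final step.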

Highlights

  • Oriented features from accelerated segment test and rotated binary robust independent elementary features simultaneous localization and mapping 2 (ORB-SLAM2) [1,2,3,4] is a complete simultaneous localization and mapping solution based on monocular, stereo and RGB-Depth (RGB-D) cameras

  • Compared with the oriented FAST and rotated BRIEF (ORB)-SLAM2 visual odometry method, the proposed algorithm reduces the root mean square error (RMSE) and mean value of the translation error on the two data sets by 37.48%, 31.93%, 42.99% and 32.78%, respectively; the corresponding rotation errors are reduced by 35.86%, 29.85%, 37.87% and 37.59%

  • Visual odometry is one of the key technologies in the field of visual SLAM and has a wide range of applications in autonomous navigation, autonomous driving and augmented reality

Introduction

Oriented features from accelerated segment test (oFAST) and rotated binary robust independent elementary features (rBRIEF) simultaneous localization and mapping 2 (ORB-SLAM2) [1,2,3,4] is a complete simultaneous localization and mapping solution based on monocular, stereo and RGB-Depth (RGB-D) cameras. One of its contributions is an efficient visual odometry (VO) method based on improved oriented FAST and rotated BRIEF operators. A complete visual odometry pipeline consists of three main parts: (1) image feature extraction and matching, mainly including image feature detection, description and matching; (2) mismatched point culling, which improves image feature matching accuracy; and (3) motion pose estimation and triangulation. This technology is widely used in autonomous robot positioning, virtual reality, augmented reality and autonomous driving. The camera pose obtained with the proposed scheme has better robustness and higher precision.
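
The adaptive-threshold oFAST extraction named in part (1) is the paper's first contribution. Its exact thresholding rule is not reproduced here; the sketch below shows one plausible scheme under the assumption that the FAST threshold is scaled with the frame's gray-level contrast, where alpha, t_min and t_max are hypothetical tuning parameters.

    import cv2
    import numpy as np

    def adaptive_orb_features(gray, nfeatures=2000, alpha=0.3, t_min=5, t_max=40):
        # Hypothetical adaptive rule: scale the FAST threshold with the
        # image's intensity standard deviation, clamped to a sane range,
        # so dark or low-contrast frames still yield enough corners.
        t = int(np.clip(alpha * gray.std(), t_min, t_max))
        orb = cv2.ORB_create(nfeatures=nfeatures, fastThreshold=t)
        # detectAndCompute runs oFAST detection and rBRIEF description together.
        return orb.detectAndCompute(gray, None)

A typical call would be kp, des = adaptive_orb_features(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)); a fixed threshold tends to under-detect in low-contrast frames, which is the failure mode an adaptive rule addresses.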

Related Work
Adaptive Threshold oFAST Algorithm for Feature Extraction
New Image Matching Algorithm
Pose Estimation Algorithm
Experimental Data and Computing Environment
Experimental Results and Analysis of Image Feature Extraction Algorithm
Experimental Results and Analysis of Image Feature Matching Algorithm
Conclusions