Abstract

Autonomous mobile robots (AMRs) require SLAM technology for positioning and mapping, and the accuracy and real-time performance of SLAM are key to ensuring that a robot can complete its driving tasks safely and accurately. Feature-point-based visual SLAM systems offer high accuracy and robustness but poor real-time performance. This paper proposes a lightweight visual odometry (VO) based on Lucas–Kanade (LK) optical flow tracking. First, a robust key-point matching relationship between adjacent images is established using a uniform motion model and a pyramid-based sparse optical flow tracking algorithm. Then, mismatched points are removed using the grid-based motion statistics (GMS) algorithm followed by the random sample consensus (RANSAC) algorithm. Finally, the proposed algorithm is compared with the ORB-SLAM3 front end on a dataset to verify its effectiveness. The results show that the proposed algorithm effectively improves the real-time performance of the system while maintaining its accuracy and robustness.
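To make the described pipeline concrete, the sketch below illustrates one possible front-end step in Python with OpenCV: a uniform (constant-velocity) motion prediction, pyramidal LK sparse optical flow, GMS-based mismatch rejection, and a RANSAC pass during essential-matrix estimation. This is an illustration of the general technique, not the authors' implementation; the function name `vo_front_end_step`, all parameter values (window size, pyramid levels, RANSAC threshold), and the reliance on the opencv-contrib `matchGMS` helper are assumptions.

```python
import cv2
import numpy as np

def vo_front_end_step(prev_img, cur_img, prev_pts, K, prev_flow=None):
    """One hypothetical VO front-end step: predict with a uniform motion
    model, track with pyramidal LK optical flow, then prune mismatches
    with GMS followed by RANSAC. K is the camera intrinsic matrix."""
    # Uniform (constant-velocity) motion model: predict where each key
    # point should land in the current image.
    if prev_flow is not None:
        guess = (prev_pts + prev_flow).astype(np.float32)
    else:
        guess = prev_pts.copy()

    # Pyramid-based sparse LK optical flow, seeded with the prediction.
    cur_pts, status, _ = cv2.calcOpticalFlowPyrLK(
        prev_img, cur_img, prev_pts, guess,
        winSize=(21, 21), maxLevel=3,
        flags=cv2.OPTFLOW_USE_INITIAL_FLOW)
    ok = status.ravel() == 1
    p0, p1 = prev_pts[ok], cur_pts[ok]

    # Grid-based motion statistics (GMS): keep tracks whose neighbours
    # move consistently. LK tracks are wrapped as 1-to-1 matches so the
    # opencv-contrib helper can consume them.
    kp0 = [cv2.KeyPoint(float(x), float(y), 1) for x, y in p0.reshape(-1, 2)]
    kp1 = [cv2.KeyPoint(float(x), float(y), 1) for x, y in p1.reshape(-1, 2)]
    matches = [cv2.DMatch(i, i, 0) for i in range(len(kp0))]
    h, w = prev_img.shape[:2]
    gms = cv2.xfeatures2d.matchGMS((w, h), (w, h), kp0, kp1, matches,
                                   withRotation=False, withScale=False)
    keep = [m.queryIdx for m in gms]
    p0, p1 = p0[keep], p1[keep]

    # RANSAC: estimate the essential matrix and drop remaining outliers.
    E, inlier_mask = cv2.findEssentialMat(p0, p1, K, method=cv2.RANSAC,
                                          prob=0.999, threshold=1.0)
    inliers = inlier_mask.ravel() == 1
    p0, p1 = p0[inliers], p1[inliers]

    # Per-point flow, reusable as the next frame's uniform-motion prediction.
    return p0, p1, E, (p1 - p0)
```

In this sketch the caller would feed the surviving points `p1` and the returned flow back in as `prev_pts` and `prev_flow` for the next frame, which is how the uniform motion model supplies an initial guess to the LK tracker.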
