Abstract
Most current Simultaneous Localization and Mapping (SLAM) systems assume that the surroundings of an autonomous mobile robot are static. This assumption breaks down in dynamic urban and office settings and undermines the effectiveness of existing SLAM algorithms. We propose an enhanced visual-inertial odometry system that accurately estimates the robot's pose in real time in semi-dynamic environments. To ensure real-time performance, the front end of the algorithm combines ORB features with optical-flow matching. To address feature mismatches caused by dynamic and temporarily stationary objects, we design an outlier-elimination mechanism based on feature reprojection and Random Sample Consensus (RANSAC). During feature tracking, before the current robot pose is computed, outliers are detected and removed under the assumption of small inter-frame motion, mitigating the influence of dynamic and temporarily stationary objects and thus improving the robustness of the algorithm. The effectiveness and accuracy of the algorithm are validated through quantitative comparisons with state-of-the-art methods on the EuRoC dataset.
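The RANSAC-based outlier elimination described above can be illustrated with a minimal sketch. The sketch below is not the paper's implementation; it assumes a hypothetical one-point translation model in which, under the small inter-frame motion assumption, most static features share one dominant optical-flow vector, and matches whose flow deviates from the RANSAC-estimated consensus beyond a pixel threshold are rejected as dynamic or mismatched. The function name and parameters are illustrative only.

```python
import random

def ransac_flow_outliers(matches, iters=100, thresh=2.0, seed=0):
    """Flag feature matches inconsistent with small inter-frame motion.

    matches: list of ((x1, y1), (x2, y2)) pixel correspondences between
    consecutive frames. Hypothetical model: RANSAC repeatedly samples one
    flow vector as the candidate dominant translation, counts matches
    whose flow lies within `thresh` pixels of it, and keeps the largest
    consensus set. Indices outside that set are returned as outliers.
    """
    rng = random.Random(seed)
    # Per-match flow vectors (displacement between the two frames).
    flows = [(x2 - x1, y2 - y1) for (x1, y1), (x2, y2) in matches]
    best_inliers = []
    for _ in range(iters):
        dx, dy = rng.choice(flows)  # one-point hypothesis: candidate translation
        inliers = [i for i, (fx, fy) in enumerate(flows)
                   if (fx - dx) ** 2 + (fy - dy) ** 2 <= thresh ** 2]
        if len(inliers) > len(best_inliers):
            best_inliers = inliers
    inlier_set = set(best_inliers)
    return [i for i in range(len(matches)) if i not in inlier_set]

# Eight static features drift by (1, 0); one dynamic feature moves by (20, 5)
# and is flagged as an outlier.
matches = [((i, i), (i + 1, i)) for i in range(8)] + [((0.0, 0.0), (20.0, 5.0))]
print(ransac_flow_outliers(matches))
```

A production system would replace the one-point translation model with a geometric check such as reprojection error against an estimated fundamental matrix or pose, but the consensus-voting structure is the same.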