Abstract
Visual Simultaneous Localization and Mapping (VSLAM) is considered a fundamental capability for autonomous mobile robots. However, most existing VSLAM algorithms adopt a strong scene-rigidity assumption for analytical convenience, which ignores the influence that independently moving objects in real environments have on the accuracy of the SLAM system. Hence, this paper proposes MGC-VSLAM (Meshing-based and Geometric Constraint VSLAM), a novel VSLAM algorithm for dynamic indoor environments built on the RGB-D mode of ORB-SLAM2, which addresses the problems of uniform ORB feature distribution and dynamic feature filtering. In detail, to address the over-uniform distribution of feature points extracted by the quadtree-based algorithm in ORB-SLAM2, a meshing-based feature uniform distribution algorithm is proposed: meshes are divided at each layer of the image pyramid, and a specific number of features in each mesh are retained according to their Harris response values. In addition, to address the impact of features extracted from dynamic targets on the SLAM system, a dynamic feature filtering method is proposed: first, a stable matching relationship is established through a feature matching constraint method; then a novel geometric constraint method filters out the dynamic feature points in the scene. Only the remaining static features are kept, achieving accurate camera pose estimation in dynamic environments. Experiments on the Oxford dataset and the public TUM RGB-D dataset are conducted to evaluate the proposed approach. The results show that MGC-VSLAM effectively improves the positioning accuracy of ORB-SLAM2 in highly dynamic scenarios.
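The per-mesh retention step described above can be sketched as follows. This is a minimal illustration of the idea (bucket features into grid cells, keep the strongest per cell by Harris response), not the paper's implementation; the `Feature` type, grid size, and per-cell quota are all illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Feature:
    x: float          # pixel column of the keypoint
    y: float          # pixel row of the keypoint
    response: float   # Harris corner response score

def retain_per_mesh(features, img_w, img_h,
                    grid_cols=8, grid_rows=8, per_cell=5):
    """Keep at most `per_cell` features in each mesh cell,
    ranked by Harris response (grid size and quota are
    illustrative, not the paper's values)."""
    cell_w = img_w / grid_cols
    cell_h = img_h / grid_rows
    cells = {}
    for f in features:
        c = min(int(f.x // cell_w), grid_cols - 1)
        r = min(int(f.y // cell_h), grid_rows - 1)
        cells.setdefault((r, c), []).append(f)
    kept = []
    for bucket in cells.values():
        # Strongest corners first; truncate to the per-cell quota.
        bucket.sort(key=lambda f: f.response, reverse=True)
        kept.extend(bucket[:per_cell])
    return kept
```

In ORB-SLAM2 this kind of step would run once per pyramid level, so that feature density is balanced across both image regions and scales.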
Highlights
Simultaneous Localization and Mapping (SLAM), as the core technology of intelligent mobile robots, refers to a robot simultaneously localizing itself and constructing a map of the surrounding environment without any prior environmental information [1]–[2].
The MGC-VSLAM system is introduced from three main aspects: the system framework, the feature uniform distribution algorithm, and the dynamic feature point filtering algorithm.
The results show that the relative accuracy improvement of MGC-VSLAM is lower only than that of Lin's system in the w_static sequence and lower than that of DynaSLAM in the w_rpy sequence.
Summary
Simultaneous Localization and Mapping (SLAM), as the core technology of intelligent mobile robots, refers to a robot simultaneously localizing itself and constructing a map of the surrounding environment without any prior environmental information [1]–[2]. The standard ORB algorithm tends to concentrate features in strongly textured regions. As a consequence, these features cannot represent the whole image well, and the number of matched features drops significantly when the concentrated regions of two adjacent frames differ [9]. This problem makes the SLAM system unstable and, more seriously, can cause tracking loss [10]. Independently moving objects often appear in the scene; they introduce errors into the visual odometry estimation, and the moving objects are recorded in the resulting map, which makes the built maps unsuitable for subsequent robot intelligent capture, navigation, and other tasks.
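A common way to detect features on moving objects, and a plausible reading of the geometric constraint mentioned in the abstract, is an epipolar check: under a rigid scene, a matched point in the second frame must lie near the epipolar line induced by the fundamental matrix, so matches far from that line are flagged as dynamic. The sketch below assumes the fundamental matrix `F` is already estimated (e.g., by RANSAC); the paper's exact constraint and threshold may differ.

```python
import numpy as np

def epipolar_distance(p1, p2, F):
    """Distance (pixels) from point p2 in frame 2 to the epipolar
    line F @ p1 induced by point p1 in frame 1."""
    p1_h = np.array([p1[0], p1[1], 1.0])
    p2_h = np.array([p2[0], p2[1], 1.0])
    line = F @ p1_h  # epipolar line [a, b, c] in frame 2
    return abs(line @ p2_h) / np.hypot(line[0], line[1])

def filter_dynamic(matches, F, thresh=1.0):
    """Keep only matches whose epipolar distance is below `thresh`;
    points violating the constraint are treated as dynamic.
    (The threshold value is an illustrative assumption.)"""
    return [(p1, p2) for p1, p2 in matches
            if epipolar_distance(p1, p2, F) < thresh]
```

For example, with a camera translating purely along the x-axis, a static point may slide along its epipolar line while a point on an independently moving object drifts off it and is discarded.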