Abstract

The state-of-the-art simultaneous localization and mapping (SLAM) methods have demonstrated satisfactory results in static environments. However, most existing methods cannot be applied directly in dynamic environments because they do not account for moving objects. To solve the practical problem of SLAM in dynamic environments, an RGB-D SLAM method is developed based on moving-object removal and dense map reconstruction. Specifically, deep-learning-based instance segmentation is performed to obtain semantic information, which is then combined with multiview geometry to detect potentially moving objects. A Kalman filter and a feature fusion algorithm are further used to track moving objects, improving detection accuracy. The camera pose is then estimated from the static feature points that remain after motion removal, enabling reliable localization in dynamic environments. Because sparse feature maps are insufficient for navigation and perception, a single-view dense map is constructed from the RGB image and the depth map collected by the RGB-D camera after motion removal. The dense environmental map is finally reconstructed by registering the point clouds of selected key frames. Experiments are conducted on public datasets and real-scene data to evaluate the performance of the proposed method. Compared with the state-of-the-art method ORB-SLAM2, the absolute trajectory error and the relative pose error obtained by the proposed method are reduced by at least 91%, and the localization accuracy reaches 0.001 m.
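As a rough illustration of the motion-removal step described in the abstract, the Python sketch below combines an instance segmentation mask with an epipolar-geometry consistency check to separate static from dynamic feature matches. The function and variable names (filter_static_points, dynamic_mask, DIST_THRESH) and the thresholds are illustrative assumptions, not the authors' implementation; the paper's pipeline additionally uses Kalman-filter tracking and feature fusion, which are omitted here.

```python
import numpy as np
import cv2

# Illustrative sketch only: combine a segmentation mask with an epipolar
# consistency check. Names and thresholds are assumptions, not the paper's code.

DIST_THRESH = 1.0  # max point-to-epipolar-line distance (pixels) to count as static


def filter_static_points(pts_prev, pts_curr, dynamic_mask):
    """Return the matched feature points judged static.

    pts_prev, pts_curr: (N, 2) float32 arrays of matched pixel coordinates
                        in the previous and current frames.
    dynamic_mask:       (H, W) uint8 mask, nonzero on segmented movable
                        instances (e.g. people) in the current frame.
    """
    # 1) Drop matches that land on segmented movable objects.
    cols = pts_curr[:, 0].astype(int)
    rows = pts_curr[:, 1].astype(int)
    off_mask = dynamic_mask[rows, cols] == 0
    p1, p2 = pts_prev[off_mask], pts_curr[off_mask]

    # 2) Estimate the fundamental matrix from the remaining matches (RANSAC).
    if len(p1) < 8:
        return p1, p2
    F, _ = cv2.findFundamentalMat(p1, p2, cv2.FM_RANSAC, 3.0, 0.99)
    if F is None:
        return p1, p2

    # 3) Keep points close to their epipolar lines; a large residual suggests
    #    independent motion (e.g. a mover the segmentation network missed).
    ones = np.ones((len(p1), 1), dtype=np.float32)
    lines = (F @ np.hstack([p1, ones]).T).T          # epipolar lines l = F x
    num = np.abs(np.sum(lines * np.hstack([p2, ones]), axis=1))
    den = np.sqrt(lines[:, 0] ** 2 + lines[:, 1] ** 2)
    static = num / den < DIST_THRESH
    return p1[static], p2[static]
```

In a full system of this kind, the surviving static matches would feed the camera pose estimation, and the same masks would plausibly be applied to the RGB and depth images before fusing keyframes into the dense point-cloud map.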
