Abstract

Localization in unknown environments is an essential capability for vision-based navigation of robotic vehicles in intelligent transportation systems. However, moving objects in dynamic scenes make robot localization difficult, because motion estimation is disturbed by the growing number of feature outliers that those objects produce. To improve the accuracy and robustness of robot localization, a novel saliency-induced moving object detection (SMOD) approach is proposed to filter out feature outliers for RGB-D-based simultaneous localization and mapping (SLAM) in complex dynamic workspaces. First, three complementary motion saliency potentials, namely motion energy (ME), spatiotemporal objectness (STO), and dynamic superpixels (DS), are modeled by fully analyzing the spatial, temporal, appearance, and depth cues in the RGB-D inputs; together they identify dynamic objects effectively against diversely changing backgrounds. Then, a superpixel-level graph-based motion saliency (MS) measure is proposed to generate an MS map for reliable localization of the moving objects. The edge weights and background nodes of the graph are determined by fusing ME, STO, and DS, which makes the measure robust to background interference. Furthermore, the SMOD approach is embedded into the front end of ORB-SLAM3 as a pre-processing stage to filter out feature outliers associated with the moving objects. Finally, extensive experiments on public dynamic datasets verify the accuracy and robustness of the proposed approach. The results show that the SMOD method detects moving objects effectively in a variety of challenging dynamic environments and separates dynamic regions reliably from the irrelevant background. The comparison further demonstrates that the SMOD-SLAM navigation system outperforms other state-of-the-art dynamic visual SLAM (vSLAM) systems.
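The final filtering step described above admits a simple illustration. The sketch below is not the authors' implementation; it only shows, under assumed interfaces, how a motion saliency (MS) map could be thresholded into a dynamic-object mask and used to discard feature points before pose estimation. The function name `filter_dynamic_features`, the threshold value, and the array layouts are all illustrative assumptions.

```python
import numpy as np

def filter_dynamic_features(keypoints, saliency_map, threshold=0.5):
    """Discard feature points that fall inside high-motion-saliency regions.

    keypoints   : (N, 2) array of (x, y) pixel coordinates of detected features.
    saliency_map: (H, W) array with motion saliency values in [0, 1],
                  e.g. an MS map produced by a moving-object detector.
    threshold   : saliency above this value marks a pixel as dynamic
                  (an assumed, tunable parameter).

    Returns the subset of keypoints lying on static background, which would
    then be passed on to feature matching and pose estimation.
    """
    dynamic_mask = saliency_map >= threshold  # boolean (H, W) dynamic-region mask
    kept = [
        (x, y)
        for x, y in keypoints
        if not dynamic_mask[int(y), int(x)]  # keep only points outside the mask
    ]
    return np.array(kept).reshape(-1, 2)
```

In a full pipeline, the same mask would typically be applied right after feature extraction in the SLAM front end, so that tracking and mapping only ever see features from the static background.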
