Abstract

The robustness of dense visual odometry remains a challenging problem when moving objects appear in the scene. In this paper, we propose a dense visual odometry method that handles highly dynamic environments using RGB-D data. First, to detect dynamic objects, we propose a multi-frame residual computation model that takes a frame with a large temporal baseline into consideration to achieve temporally consistent motion segmentation. The proposed method then combines a scene clustering model with a nonparametric statistical model to obtain weighted cluster-wise residuals, where each weight encodes how strongly a cluster's residual is trusted. The motion segmentation labels and cluster weights are then incorporated into the energy-function optimization of dense visual odometry to reduce the influence of moving objects. Finally, experimental results demonstrate that the proposed method outperforms state-of-the-art methods on many challenging sequences from a benchmark dataset, especially on highly dynamic ones.
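The weighting idea described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function names, the Student-t style influence function, and the parameter `nu` are assumptions chosen to show how cluster-wise residual weights can down-weight dynamic clusters in a sum-of-squares energy.

```python
import math

def cluster_weights(cluster_residuals, nu=5.0):
    """Assign each cluster a weight from its mean squared residual.

    Hypothetical stand-in for the paper's nonparametric statistical
    model: clusters with large residuals (likely moving objects)
    receive small weights via a Student-t style influence function.
    """
    weights = {}
    for cid, res in cluster_residuals.items():
        mean_sq = sum(r * r for r in res) / len(res)
        # Weight decreases monotonically as the cluster residual grows.
        weights[cid] = (nu + 1.0) / (nu + mean_sq)
    return weights

def weighted_energy(cluster_residuals, weights):
    """Weighted sum-of-squares energy: down-weighted (dynamic)
    clusters contribute less to the odometry optimization."""
    return sum(
        weights[cid] * sum(r * r for r in res)
        for cid, res in cluster_residuals.items()
    )

# Example: a static background cluster vs. a moving-object cluster.
clusters = {"static": [0.1, 0.2, 0.05], "dynamic": [2.0, 3.0, 2.5]}
w = cluster_weights(clusters)
energy = weighted_energy(clusters, w)
```

In a full pipeline, minimizing this energy over the camera pose (e.g. with Gauss-Newton) would yield an estimate that is less biased by the moving-object cluster, since its residuals are suppressed by the small weight.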
