Abstract

This letter presents a novel method for visual odometry estimation from an RGB-D camera. The camera motion is estimated by aligning a source RGB-D frame to a target RGB-D frame using an intensity-assisted iterative closest point (ICP) algorithm. The proposed method differs from conventional ICP in the following aspects. 1) To reduce the computational cost, salient point selection is performed on the source frame, so that only points carrying valuable information for registration are used. 2) To reduce the influence of outliers and noise, a robust weighting function is proposed that weights corresponding pairs based on the statistics of their spatial distances and intensity differences. 3) The robust weighting function obtained in 2) is reused for correspondence estimation in the following ICP iteration. The proposed method runs in real time on a single CPU thread and is therefore suitable for robots with limited computational resources. Evaluation on the TUM RGB-D benchmark shows that, in the majority of the tested sequences, the proposed method outperforms state-of-the-art methods in terms of translational drift per second, at a computation speed of 78 Hz.
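To make the weighted-ICP idea in the abstract concrete, the sketch below illustrates one way the robust weighting and the weighted rigid-motion update could be realized. It is a minimal illustration only: the Tukey-style kernel, the MAD-based scale estimate, and all function names are assumptions for exposition, not the paper's exact formulation.

```python
# Illustrative sketch (NOT the paper's implementation): correspondences are
# down-weighted using statistics of their spatial distances and intensity
# differences, and the weights enter a weighted rigid-alignment step.
import numpy as np

def robust_weights(spatial_dist, intensity_diff, c=4.6851):
    """Per-correspondence weights from spatial and intensity residuals.

    Assumption: a Tukey biweight kernel scaled by the median absolute
    deviation (MAD) of each residual channel.
    """
    def tukey(residual):
        # Robust spread estimate of the residuals (MAD scaled to sigma).
        sigma = 1.4826 * np.median(np.abs(residual - np.median(residual))) + 1e-9
        r = residual / (c * sigma)
        w = (1.0 - r**2) ** 2
        w[np.abs(r) >= 1.0] = 0.0  # reject gross outliers entirely
        return w

    # Combine geometric and photometric evidence multiplicatively.
    return tukey(spatial_dist) * tukey(intensity_diff)


def weighted_rigid_transform(src, dst, w):
    """Weighted Kabsch/Umeyama solve for R, t minimizing sum_i w_i ||R p_i + t - q_i||^2."""
    w = w / w.sum()
    mu_s = (w[:, None] * src).sum(axis=0)
    mu_d = (w[:, None] * dst).sum(axis=0)
    H = (w[:, None] * (src - mu_s)).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T  # proper rotation (reflection corrected)
    t = mu_d - R @ mu_s
    return R, t
```

In such a scheme, the weights computed at one iteration can also bias the next round of correspondence search, which is the role point 3) of the abstract attributes to the proposed weighting function.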
