Abstract
In recent years, more and more robots have been equipped with low-cost RGB-D sensors, such as the Microsoft Kinect and Intel RealSense, for safe navigation and active interaction with objects and people. To obtain accurate and reliable fused color and depth information (colored point clouds), not only must the intrinsic and extrinsic parameters of the color and depth sensors be precisely calibrated, but the depth measurements also require external correction. In this paper, we propose a reliable calibration framework, built around a motion capture system, that enables precise estimation of the intrinsic and extrinsic parameters of RGB-D sensors, and we provide a model-free depth calibration method based on heteroscedastic Gaussian processes. Compared with existing depth correction techniques, our method simultaneously estimates the mean and variance of the depth error at different measurement distances, i.e., the probability distribution of the depth error as a function of the measured distance, which is essential in state estimation problems. To verify the effectiveness of our approach, we conduct a thorough qualitative and quantitative analysis of the major steps of our calibration method and compare our experimental results with other related work. Furthermore, we demonstrate the overall improvement in visual SLAM achieved with a Kinect device calibrated by our technique.
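To make the idea of distance-dependent depth-error modelling concrete, the sketch below fits a simple two-stage heteroscedastic Gaussian process regression (in the spirit of "most likely" heteroscedastic GPs) to synthetic depth-error data with scikit-learn. This is not the paper's implementation; the synthetic data, kernel choices, and the two-stage approximation are illustrative assumptions only, but it shows how both the mean and the variance of the depth error can be queried as functions of the measured distance.

```python
# Minimal two-stage heteroscedastic GP sketch (NOT the authors' code):
# stage 1 models the mean depth error, stage 2 models the log squared
# residuals, i.e. the distance-dependent noise variance.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic training data: measured distance z (metres) and depth error e(z),
# whose spread grows with distance, as typically observed on RGB-D sensors.
z = rng.uniform(0.5, 4.0, size=(200, 1))
noise_std = 0.002 + 0.01 * (z - 0.5) ** 2          # assumed ground-truth spread
e = (0.005 * z**2 - 0.004 * z + noise_std * rng.standard_normal(z.shape)).ravel()

# Stage 1: homoscedastic GP for the mean depth error.
mean_gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=1.0) + WhiteKernel())
mean_gp.fit(z, e)

# Stage 2: second GP on log squared residuals gives the
# input-dependent (heteroscedastic) noise level.
residual_sq = (e - mean_gp.predict(z)) ** 2
var_gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=1.0) + WhiteKernel())
var_gp.fit(z, np.log(residual_sq + 1e-12))

# Refit the mean GP with the learned per-sample noise on the diagonal.
per_sample_var = np.exp(var_gp.predict(z))
mean_gp = GaussianProcessRegressor(
    kernel=ConstantKernel() * RBF(length_scale=1.0),
    alpha=per_sample_var)
mean_gp.fit(z, e)

# Query the error distribution at new measurement distances.
z_query = np.linspace(0.5, 4.0, 8).reshape(-1, 1)
err_mean = mean_gp.predict(z_query)
err_std = np.sqrt(np.exp(var_gp.predict(z_query)))
for d, m, s in zip(z_query.ravel(), err_mean, err_std):
    print(f"distance {d:.2f} m: error mean {m*100:+.2f} cm, std {s*100:.2f} cm")
```

In such a scheme, the predicted mean would be used to correct each depth measurement, while the predicted standard deviation supplies the measurement covariance that a state estimator (e.g., a SLAM back end) needs.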