Abstract

RGB-D cameras, which can be attached to any mobile device and work under different operating platforms (e.g., iOS, Android, and Windows), have great potential for indoor 3D modeling and navigation due to their low cost and small size. The main problems of RGB-D cameras for such applications are their range limitations and degraded depth accuracy. For example, at a 7-m range, the distance error of the Structure Sensor (one type of RGB-D camera) reaches nearly 0.5 m. We propose a new calibration procedure for RGB-D sensors to improve depth accuracy. First, the baseline between the RGB and IR cameras is calibrated using the direct linear transform (DLT) method. The distortions of the RGB and IR cameras and the IR projector are then calibrated using the newly proposed two-lens distortion model. Finally, the remaining systematic depth errors are calibrated using an empirical model. Compared to existing calibration methods, the new method considers distortions from both the IR camera and the projector and significantly improves the accuracy of far-range depth measurements. The experimental results show that the proposed calibration method can precisely calibrate the full range of the RGB-D sensor, up to 7 m, with an overall depth accuracy of 1.9%, compared to the 5.5% accuracy of the manufacturer's depth estimation. To demonstrate the significance of calibration in indoor mapping, the 3D point cloud of a room (4.5 m × 3.5 m) is generated using an RGB-D SLAM system. The accuracy of the 3D model with the proposed calibration method is approximately 1.5 cm, compared to 7.0 cm using the manufacturer's calibration parameters.
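The final calibration step, correcting the remaining systematic depth errors with an empirical model, can be illustrated with a minimal sketch. The abstract does not specify the form of the empirical model; a low-order polynomial in the measured depth, fitted to residuals against ground-truth ranges, is a common choice and is assumed here purely for illustration. The synthetic error below is hypothetical, chosen only so that it reaches roughly 0.5 m at 7 m, in line with the magnitude quoted above.

```python
import numpy as np

def fit_depth_correction(measured_m, true_m, degree=2):
    """Fit an empirical polynomial mapping measured depth (m) to its
    systematic error (m). Assumed form; the paper's model may differ."""
    measured_m = np.asarray(measured_m, dtype=float)
    residual = measured_m - np.asarray(true_m, dtype=float)
    coeffs = np.polyfit(measured_m, residual, degree)
    return np.poly1d(coeffs)

def correct_depth(measured_m, error_model):
    """Subtract the predicted systematic error from raw measurements."""
    measured_m = np.asarray(measured_m, dtype=float)
    return measured_m - error_model(measured_m)

# Synthetic example: a depth error growing quadratically with range,
# reaching about 0.5 m at 7 m (hypothetical, matching the abstract's
# order of magnitude for the uncalibrated sensor).
true_depth = np.linspace(0.5, 7.0, 50)
measured = true_depth + 0.01 * true_depth**2

model = fit_depth_correction(measured, true_depth)
corrected = correct_depth(measured, model)

print("max raw error (m):      ", np.max(np.abs(measured - true_depth)))
print("max corrected error (m):", np.max(np.abs(corrected - true_depth)))
```

In practice such a model would be fitted against a surveyed reference (e.g., a checkerboard or laser-scanned wall at known distances) after the geometric distortion calibration has been applied, so that only the range-dependent residual remains.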
