Abstract

When a Time-of-Flight (TOF) camera is used to obtain depth values, corner distortion and precision offset often occur. At present, the main methods for compensating depth errors rely on techniques such as error look-up tables or curve fitting, which require a large amount of computation and therefore compensate slowly. By analyzing the depth error distribution of a TOF camera at different distances, a real-time, high-precision error compensation method was proposed. The error compensation model was simplified by exploiting the rotational symmetry of the TOF depth image and the characteristics of the error distribution; this reduced the number of parameters by an order of magnitude and effectively improved both the accuracy and the speed of the compensation process. When the proposed algorithm was applied to the Kinect v2 depth sensor, the flatness error within the effective distance dropped to 0.63 mm, the average error dropped to 0.7040 mm, and the single-frame compensation time was less than 90 ms. Since the algorithm compensates based only on the optical path difference, it is applicable to any camera built on the TOF principle. Experimental results show that the proposed algorithm quickly and effectively reduces the depth error of a TOF camera and is suitable for real-time, high-precision three-dimensional reconstruction over a large field of view.
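The core idea of optical-path-based compensation can be illustrated with a minimal sketch. The snippet below assumes a pinhole intrinsic model (fx, fy, cx, cy are hypothetical calibration values) and that the raw TOF measurement at each pixel is a radial distance along the pixel's ray; it does not reproduce the paper's fitted error model, only the rotationally symmetric radial-to-axial correction on which such compensation is based. Because the correction factor depends only on a pixel's radius from the principal point, it can be precomputed once per camera and applied to every frame with a single multiply per pixel, which is consistent with the real-time goal stated above.

```python
import numpy as np

def compensate_tof_depth(depth, fx, fy, cx, cy):
    """Sketch: convert per-pixel radial (optical-path) distances from a TOF
    sensor into axial depth, assuming a pinhole camera model.

    depth        : H x W array of raw TOF measurements (mm)
    fx, fy       : focal lengths in pixels (assumed calibration values)
    cx, cy       : principal point in pixels (assumed calibration values)
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    # Normalized ray directions for every pixel.
    x = (u - cx) / fx
    y = (v - cy) / fy
    # The correction depends only on the pixel's radial offset (x^2 + y^2),
    # i.e. it is rotationally symmetric about the optical axis, so this map
    # can be cached and reused for all subsequent frames.
    scale = 1.0 / np.sqrt(1.0 + x**2 + y**2)
    return depth * scale
```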
