Abstract

The time-correlated single-photon counting (TCSPC) three-dimensional (3D) imaging lidar system has broad application prospects in low-light 3D imaging because of its single-photon detection sensitivity and picosecond temporal resolution. However, conventional TCSPC systems limit the echo photon flux to an ultra-low level to obtain high-accuracy depth images, and therefore require long acquisition times to accumulate enough photon detection events to form a reliable histogram. When the echo photon flux is increased to a medium or even high level, the data acquisition time can be shortened, but the photon pile-up effect seriously distorts the photon histogram and causes depth errors. To realize high-accuracy TCSPC depth imaging with a shorter acquisition time, we propose a high-flux fast photon-counting 3D imaging method based on empirical depth error correction. First, we derive a photon flux estimation formula and calculate the depth error of our photon-counting lidar under different photon fluxes from experimental data. Then, a correction model relating the depth error to the number of echo photons is established by numerical fitting. Finally, this correction model is used to correct depth images acquired at high photon flux with different acquisition times. Experimental results show that the empirical error correction method can shorten the image acquisition time by about one order of magnitude while maintaining moderate depth-image accuracy.
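The sketch below illustrates the overall workflow described above under stated assumptions: it estimates the mean echo-photon number per pulse from per-pixel detection counts using the standard Poisson pile-up relation (a stand-in for the paper's derived flux-estimation formula), fits a correction model between photon number and depth error by numerical fitting (a low-order polynomial is assumed here; the actual functional form is the paper's), and applies the correction to a raw depth image. All function names and the synthetic calibration data are hypothetical and for illustration only.

```python
import numpy as np

def estimate_photons_per_pulse(detections, pulses):
    """Estimate the mean number of echo photons per laser pulse from the
    per-pixel detection count, assuming Poisson arrivals and at most one
    detection per pulse (standard pile-up relation: lambda = -ln(1 - k/N));
    used here as a stand-in for the paper's derived flux-estimation formula."""
    p_detect = np.clip(detections / pulses, 0.0, 1.0 - 1e-9)
    return -np.log1p(-p_detect)

def fit_error_model(photon_counts, depth_errors, degree=3):
    """Fit a correction model mapping estimated echo-photon number to the
    measured depth error (numerical fitting step; the polynomial degree is
    an assumption, not the paper's specified functional form)."""
    return np.polyfit(photon_counts, depth_errors, degree)

def correct_depth(depth_raw, photon_counts, model):
    """Subtract the predicted pile-up depth error from the raw depth image."""
    return depth_raw - np.polyval(model, photon_counts)

# --- illustrative usage with synthetic calibration data (not real measurements) ---
rng = np.random.default_rng(0)
lam_cal = np.linspace(0.05, 2.0, 40)             # assumed calibration photon levels
err_cal = 0.03 * lam_cal**2 + 0.01 * lam_cal     # assumed error-vs-flux trend (m)
model = fit_error_model(lam_cal, err_cal)

detections = rng.integers(100, 900, size=(64, 64))  # per-pixel detection counts
pulses = 1000                                        # laser pulses per pixel
lam_img = estimate_photons_per_pulse(detections, pulses)
depth_raw = rng.uniform(1.0, 5.0, size=(64, 64))     # placeholder raw depth image (m)
depth_corrected = correct_depth(depth_raw, lam_img, model)
```

In this sketch the calibration stage (flux levels and measured errors) would be replaced by the experimental data described in the abstract; only the estimate-fit-correct structure is meant to mirror the method.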
