Abstract

Photon-efficient imaging, which captures 3D images with single-photon sensors, has enabled a wide range of applications. However, two major challenges limit the reconstruction performance: the low photon counts accompanied by a low signal-to-background ratio (SBR), and the multiple-return effect. In this paper, we propose a unified deep neural network that, for the first time, explicitly addresses these two challenges and simultaneously recovers depth maps and intensity images from photon-efficient measurements. Starting from a general image formation model, our network consists of one encoder, in which a non-local block exploits the long-range correlations in both the spatial and temporal dimensions of the raw measurement, and two decoders, which recover depth and intensity, respectively. Meanwhile, we investigate the statistics of the background noise photons and propose a noise prior block to further improve the reconstruction performance. The proposed network achieves decent reconstruction fidelity even under extremely low photon counts/SBR and heavy blur caused by the multiple-return effect, significantly surpassing existing methods. Moreover, our network trained on simulated data generalizes well to real-world imaging systems, which greatly extends the application scope of photon-efficient imaging in challenging scenarios with a strict limit on optical flux. Code is available at https://github.com/JiayongO-O/PENonLocal.
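To make the described layout concrete, below is a minimal PyTorch sketch of an encoder with a spatio-temporal non-local (self-attention) block followed by two decoder heads for depth and intensity. All module names, channel counts, and kernel sizes are illustrative assumptions rather than the authors' exact architecture, and the noise prior block is omitted; see the linked repository for the actual implementation.

# Minimal sketch of the encoder / non-local block / dual-decoder layout.
# Channel counts and kernel sizes are illustrative assumptions, not the
# authors' exact architecture (see the linked repository).
import torch
import torch.nn as nn
import torch.nn.functional as F


class NonLocalBlock3D(nn.Module):
    """Self-attention over the joint spatial-temporal dimensions of the histogram cube."""

    def __init__(self, channels, reduction=2):
        super().__init__()
        inter = max(channels // reduction, 1)
        self.theta = nn.Conv3d(channels, inter, kernel_size=1)
        self.phi = nn.Conv3d(channels, inter, kernel_size=1)
        self.g = nn.Conv3d(channels, inter, kernel_size=1)
        self.out = nn.Conv3d(inter, channels, kernel_size=1)

    def forward(self, x):
        b, c, t, h, w = x.shape
        q = self.theta(x).flatten(2).transpose(1, 2)  # (b, t*h*w, inter)
        k = self.phi(x).flatten(2)                    # (b, inter, t*h*w)
        v = self.g(x).flatten(2).transpose(1, 2)      # (b, t*h*w, inter)
        attn = F.softmax(q @ k, dim=-1)               # long-range pairwise correlations
        y = (attn @ v).transpose(1, 2).reshape(b, -1, t, h, w)
        return x + self.out(y)                        # residual connection


class PhotonEfficientNet(nn.Module):
    """One shared encoder, two task-specific decoders (depth and intensity)."""

    def __init__(self, in_channels=1, feat=16):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv3d(in_channels, feat, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            NonLocalBlock3D(feat),
            nn.Conv3d(feat, feat, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
        )
        # Each head collapses the temporal (time-bin) axis into a 2D map.
        self.depth_head = nn.Conv3d(feat, 1, kernel_size=3, padding=1)
        self.intensity_head = nn.Conv3d(feat, 1, kernel_size=3, padding=1)

    def forward(self, histogram):                     # (b, 1, T, H, W) photon histogram
        feats = self.encoder(histogram)
        depth = self.depth_head(feats).mean(dim=2)    # (b, 1, H, W)
        intensity = self.intensity_head(feats).mean(dim=2)
        return depth, intensity


if __name__ == "__main__":
    x = torch.randn(1, 1, 16, 8, 8)                   # toy histogram cube
    d, i = PhotonEfficientNet()(x)
    print(d.shape, i.shape)                           # torch.Size([1, 1, 8, 8]) each

The key design point the abstract emphasizes is that the non-local block attends across all time bins and pixels jointly, rather than relying only on local convolutions, which is what allows the network to separate sparse signal photons from background noise and multiple returns.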
