Abstract

Laser speckle contrast imaging (LSCI) is a wide-field, noncontact imaging technology for mapping blood flow. Although a denoising method based on block-matching and three-dimensional transform-domain collaborative filtering (BM3D) has been proposed and significantly improves the signal-to-noise ratio (SNR), its processing time makes real-time denoising impractical. Moreover, achieving an acceptable SNR from only a few raw speckle images remains difficult given the presence of significant noise and artifacts. A feed-forward denoising convolutional neural network (DnCNN) achieves state-of-the-art performance in denoising natural images and is efficiently accelerated on GPUs, but it performs poorly when trained on raw speckle contrast images from LSCI owing to their inhomogeneous noise distribution. We therefore propose training DnCNN for LSCI in a log-transformed domain, which improves training accuracy and yields a 5.13 dB gain in the peak signal-to-noise ratio (PSNR). To reduce inference time and further improve denoising performance, we also propose a dilated deep residual learning network with skip connections (DRSNet). In image-quality evaluations, DRSNet using five raw speckle images outperforms spatial-average denoising using 20 raw speckle images. DRSNet takes 35 ms (i.e., 28 frames per second) to denoise a 486×648-pixel blood flow image on an NVIDIA 1070 GPU, approximately 2.5 times faster than DnCNN, and on the test sets it improves PSNR by 0.15 dB over DnCNN. The proposed network shows good potential for real-time monitoring of blood flow in biomedical applications.
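The sketch below illustrates the two ideas the abstract describes: (1) denoising speckle contrast images in a log-transformed domain, where the flow-dependent, multiplicative noise becomes closer to additive and spatially homogeneous, and (2) a DnCNN-style residual denoiser whose middle layers use dilated convolutions and whose skip connection subtracts the predicted noise from the input. This is a minimal illustrative sketch, not the authors' released implementation; the layer count, channel width, dilation rates, and function names are assumptions for demonstration.

```python
# Illustrative sketch of log-domain training and a dilated residual denoiser
# in the spirit of DnCNN/DRSNet. Hyperparameters are assumptions, not the
# paper's published architecture.
import torch
import torch.nn as nn


def log_transform(k, eps=1e-6):
    """Map a speckle contrast image K (values in (0, 1]) to the log domain."""
    return torch.log(k.clamp(min=eps))


def inverse_log_transform(log_k):
    """Map a denoised log-domain image back to speckle contrast values."""
    return torch.exp(log_k)


class DilatedResidualDenoiser(nn.Module):
    """DnCNN-style denoiser with dilated middle layers and a skip connection.

    The network predicts the noise component; the skip connection subtracts
    it from the input to produce the clean estimate (residual learning).
    """

    def __init__(self, channels=1, features=64, dilations=(1, 2, 4, 4, 2, 1)):
        super().__init__()
        layers = [nn.Conv2d(channels, features, 3, padding=1),
                  nn.ReLU(inplace=True)]
        for d in dilations:
            # Dilation enlarges the receptive field without extra depth,
            # which helps keep inference time low.
            layers += [nn.Conv2d(features, features, 3, padding=d, dilation=d),
                       nn.BatchNorm2d(features),
                       nn.ReLU(inplace=True)]
        layers.append(nn.Conv2d(features, channels, 3, padding=1))
        self.body = nn.Sequential(*layers)

    def forward(self, x):
        # Skip connection: clean estimate = noisy input - predicted noise.
        return x - self.body(x)


if __name__ == "__main__":
    model = DilatedResidualDenoiser()
    # Dummy speckle contrast frame at the resolution quoted in the abstract.
    noisy_contrast = torch.rand(1, 1, 486, 648) * 0.9 + 0.05
    log_k = log_transform(noisy_contrast)
    denoised = inverse_log_transform(model(log_k))
    print(denoised.shape)  # torch.Size([1, 1, 486, 648])
```

In training, the same `log_transform` would be applied to both the noisy few-frame contrast images and the many-frame averaged targets, so the network learns in the domain where the noise statistics are approximately uniform across flow levels.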
