Abstract
Two-photon microscopy is indispensable in cell and molecular biology for high-resolution visualization of cellular and molecular dynamics. However, the low signal-to-noise conditions inherent to this imaging modality significantly degrade image quality, obscuring essential details and complicating morphological analysis. Existing denoising methods such as CNNs, Noise2Noise, and DeepCAD serve broad imaging applications, but they remain limited in preserving texture and fine detail in two-photon microscopy images affected by complex noise, particularly in delicate structures such as neuronal synapses. To improve denoising of two-photon microscopy images, we propose UNet-Att, a deep learning framework that integrates a tailored UNet++ architecture with attention mechanisms, and validate it on real two-photon microscopy images. The approach consists of a downsampling module that extracts hierarchical features at multiple scales, an attention module that emphasizes salient features during feature integration, and an upsampling pathway that reconstructs the image with high fidelity, retaining textural integrity. The model is trained end to end, optimizing its denoising efficacy. UNet-Att surpasses mainstream algorithms in the dual objectives of denoising and preserving the textural intricacies of the original images, as evidenced by a 9.42 dB increase in Peak Signal-to-Noise Ratio (PSNR) and a 0.1131 improvement in the Structural Similarity Index Measure (SSIM). Ablation experiments confirm the contribution of each module. The associated Python packages and datasets of UNet-Att are freely available at https://github.com/ZjjDh/UNet-Att.
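The attention module described above weights skip-connection features by their salience before they are merged in the upsampling path. The following is a minimal NumPy sketch of one common way to realize such gating (additive attention over skip and gating features); it is an illustrative assumption, not the released UNet-Att implementation, and all names (`attention_gate`, `w_s`, `w_g`, `w_psi`) are hypothetical:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def attention_gate(skip, gate, w_s, w_g, w_psi):
    """Hypothetical additive attention gate.

    skip, gate : (C, H, W) feature maps from the encoder skip
                 connection and the decoder gating signal.
    w_s, w_g   : (C_int, C) weights acting as 1x1 convolutions.
    w_psi      : (1, C_int) weights producing a scalar mask per pixel.
    Returns the skip features reweighted by a soft attention mask,
    so salient locations are emphasized before concatenation.
    """
    # 1x1 convolutions are per-pixel linear maps over channels
    s = np.einsum('ic,chw->ihw', w_s, skip)
    g = np.einsum('ic,chw->ihw', w_g, gate)
    # additive attention: mask = sigmoid(W_psi . relu(W_s x + W_g g))
    att = np.maximum(s + g, 0.0)
    mask = sigmoid(np.einsum('oc,chw->ohw', w_psi, att))  # (1, H, W), values in (0, 1)
    return skip * mask  # broadcast mask over channels
```

Because the mask lies in (0, 1), the gate can only attenuate skip features, never amplify them; the network learns to pass texture-bearing locations through while suppressing noise-dominated ones.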