Abstract

Quantitative differential phase-contrast (qDPC) imaging is a label-free phase retrieval method for weak phase objects that uses asymmetric illumination. However, qDPC imaging with fewer intensity measurements yields an anisotropic phase distribution in the reconstructed images. Obtaining an isotropic phase transfer function requires multiple measurements, making the process time-consuming. Here, we demonstrate the feasibility of using a deep learning (DL) method for isotropic qDPC microscopy from the least number of measurements. We use a commonly employed convolutional neural network, the U-Net architecture, trained to generate 12-axis isotropic reconstructed cell images (i.e., output) from 1-axis anisotropic cell images (i.e., input). To further extend the number of images available for training, the U-Net model is trained patch-wise. In this work, seven different types of living-cell images were used for the training, validation, and testing datasets. The results on the testing datasets show that our proposed DL-based method generates 1-axis qDPC images with accuracy similar to that of 12-axis measurements. The quantitative phase value in the region of interest is recovered from 66% up to 97% of the ground-truth value, providing solid evidence of improved phase uniformity as well as of the retrieval of spatial frequencies missing from 1-axis reconstructed images. In addition, the results from our model are compared with those from paired and unpaired CycleGANs. Higher PSNR and SSIM values show the advantage of the U-Net model for isotropic qDPC microscopy. The proposed DL-based method may enable high-resolution quantitative studies in cell biology.
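The abstract describes a patch-wise supervised mapping from 1-axis anisotropic qDPC reconstructions to 12-axis isotropic ones. The following is a minimal sketch of that setup, not the authors' code: the framework (PyTorch), the network depth and widths, the patch and stride sizes, the MSE loss, and all class and function names (PhasePatchDataset, UNet, train) are illustrative assumptions. It assumes paired single-channel phase maps of equal size.

```python
# Minimal sketch of patch-wise U-Net training for 1-axis -> 12-axis qDPC
# phase maps. All hyperparameters and names are assumptions for illustration.
import torch
import torch.nn as nn
from torch.utils.data import Dataset, DataLoader

class PhasePatchDataset(Dataset):
    """Cuts paired (1-axis input, 12-axis target) phase maps into patches."""
    def __init__(self, inputs, targets, patch=128, stride=64):
        self.pairs = []
        for x, y in zip(inputs, targets):  # x, y: 2-D float numpy arrays
            h, w = x.shape
            for i in range(0, h - patch + 1, stride):
                for j in range(0, w - patch + 1, stride):
                    self.pairs.append((x[i:i+patch, j:j+patch],
                                       y[i:i+patch, j:j+patch]))

    def __len__(self):
        return len(self.pairs)

    def __getitem__(self, idx):
        x, y = self.pairs[idx]
        return (torch.from_numpy(x).float().unsqueeze(0),   # add channel dim
                torch.from_numpy(y).float().unsqueeze(0))

def conv_block(c_in, c_out):
    return nn.Sequential(
        nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True))

class UNet(nn.Module):
    """A small two-level U-Net; the paper's exact depth/widths may differ."""
    def __init__(self):
        super().__init__()
        self.enc1, self.enc2 = conv_block(1, 32), conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.mid = conv_block(64, 128)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.out = nn.Conv2d(32, 1, 1)  # single-channel phase output

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        m = self.mid(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(m), e2], dim=1))  # skip connection
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.out(d1)

def train(model, dataset, epochs=50, lr=1e-4, batch_size=16):
    loader = DataLoader(dataset, batch_size=batch_size, shuffle=True)
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.MSELoss()  # pixel-wise loss on phase values (assumed)
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()
            opt.step()
```

On held-out cell images, reconstruction quality could then be scored with standard metrics such as peak_signal_noise_ratio and structural_similarity from skimage.metrics, mirroring the PSNR/SSIM comparison against paired and unpaired CycleGANs reported above.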
