Abstract
Optical microscopy with adaptive optics (AO) allows high-resolution noninvasive imaging of subcellular structures in living organisms. As alternatives to hardware-based AO methods, supervised deep-learning approaches have recently been developed to estimate optical aberrations. However, these approaches are often limited in their generalizability due to discrepancies between training and imaging settings. Moreover, a corrective device is still required to compensate for aberrations in order to obtain high-resolution images. Here we describe a deep self-supervised learning approach for simultaneous aberration estimation and structural information recovery from a single 3D image stack acquired by widefield microscopy. The approach utilizes coordinate-based neural representations to represent highly complex structures. We experimentally validated our approach with direct-wavefront-sensing-based AO in the same samples and showed the approach is applicable to in vivo mouse brain imaging.
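To make the idea of a coordinate-based neural representation concrete, the sketch below shows one common form of such a model: a small multilayer perceptron with a Fourier-feature encoding that maps normalized 3D sample coordinates to intensity values. This is a minimal, hypothetical illustration in PyTorch; the class name, layer sizes, and encoding are assumptions for exposition, not the authors' implementation, which is additionally trained self-supervised against the acquired widefield stack to jointly estimate aberrations.

```python
import torch
import torch.nn as nn

class CoordinateMLP(nn.Module):
    """Toy coordinate-based neural representation: maps 3D coordinates
    to intensity values via random Fourier features and a small MLP.
    Illustrative sketch only, not the paper's implementation."""
    def __init__(self, num_freqs=8, hidden=128):
        super().__init__()
        # Random Fourier-feature frequencies (fixed, not trained)
        self.register_buffer("B", torch.randn(3, num_freqs) * 10.0)
        self.net = nn.Sequential(
            nn.Linear(2 * num_freqs, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),  # predicted fluorescence intensity
        )

    def forward(self, xyz):
        # xyz: (N, 3) coordinates normalized to [-1, 1]
        proj = xyz @ self.B                                        # (N, num_freqs)
        feats = torch.cat([torch.sin(proj), torch.cos(proj)], -1)  # (N, 2*num_freqs)
        return self.net(feats)                                     # (N, 1)

# Usage: query the representation at arbitrary 3D points in the volume
model = CoordinateMLP()
coords = torch.rand(1024, 3) * 2 - 1   # random normalized coordinates
intensity = model(coords)              # (1024, 1) predicted intensities
```

Because the representation is queried by coordinate rather than stored as a voxel grid, it can describe highly complex structures at arbitrary resolution, which is what makes it suitable for recovering structural information from a single aberrated image stack.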