Abstract
Super-resolution (SR) fluorescence microscopy breaks the optical diffraction limit and achieves unprecedented nanoscale resolution, thereby producing a huge impact on biological research. Structured illumination microscopy (SIM) is one of the most broadly used SR techniques. SIM applies varying, non-uniform illumination to samples, and a dedicated computational algorithm then retrieves SR information from nine or fifteen sequentially acquired images, for 2D or 3D reconstruction, respectively. Super-resolution radial fluctuation (SRRF) is a purely computational SR approach that retrieves high-frequency information by spatio-temporally analyzing stochastically fluctuating signals from an image time series (typically >200 frames). We utilized deep neural networks to directly reconstruct super-resolved images from a series of images acquired with either structured-illumination or conventional wide-field microscopy. This substantially reduces the number of required images compared to state-of-the-art methods (as few as 3 images for SIM and 5 images for SRRF). Further, by leveraging skip-layer connected U-Nets, we could restore high-quality, high-resolution images from raw data with over 100 times fewer photons. We validated our results on different cellular structures, including microtubules, mitochondria, adhesions, and actin filaments, and demonstrated applications in ultrafast, multi-color super-resolution imaging of living cells. A ready-to-use open-source package is available.
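To make the "skip-layer connected U-Net" idea concrete, the sketch below shows a minimal U-Net in PyTorch that maps a short stack of raw frames (e.g., 3 SIM frames) to a single reconstructed image. This is an illustrative assumption, not the authors' published architecture: layer widths, depth, and the fact that the output here stays at the input resolution (a real SR network would typically upsample) are all simplifications; only the encoder-decoder structure with skip connections reflects what the abstract describes.

```python
import torch
import torch.nn as nn

class TinyUNet(nn.Module):
    """Minimal U-Net with skip connections: a stack of raw frames in, one image out."""
    def __init__(self, in_frames=3, base=32):
        super().__init__()
        self.enc1 = self._block(in_frames, base)        # encoder level 1
        self.enc2 = self._block(base, base * 2)         # encoder level 2
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = self._block(base * 2, base * 4)
        self.up2 = nn.ConvTranspose2d(base * 4, base * 2, 2, stride=2)
        self.dec2 = self._block(base * 4, base * 2)     # input = upsampled + skip
        self.up1 = nn.ConvTranspose2d(base * 2, base, 2, stride=2)
        self.dec1 = self._block(base * 2, base)
        self.out = nn.Conv2d(base, 1, kernel_size=1)    # single reconstructed image

    @staticmethod
    def _block(c_in, c_out):
        return nn.Sequential(
            nn.Conv2d(c_in, c_out, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(c_out, c_out, 3, padding=1), nn.ReLU(inplace=True),
        )

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        # Skip connections: concatenate encoder features with upsampled decoder features.
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.out(d1)

# Example: a batch of 3 raw frames (hypothetical 256x256 crop) mapped to one image.
frames = torch.randn(1, 3, 256, 256)
reconstruction = TinyUNet(in_frames=3)(frames)
print(reconstruction.shape)  # torch.Size([1, 1, 256, 256])
```

The skip connections pass fine-grained spatial detail from the encoder directly to the decoder, which is the property the abstract credits for recovering high-resolution structure from low-photon raw data.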