Abstract
Biological images captured by microscopes are characterized by heterogeneous signal-to-noise ratios (SNRs) due to spatially varying photon emission across the field of view combined with camera noise. State-of-the-art unsupervised structured illumination microscopy (SIM) reconstruction algorithms, commonly implemented in the Fourier domain, do not accurately model this noise and suffer from high-frequency artifacts, user-dependent choices of smoothness constraints that encode assumptions about biological features, and unphysical negative values in the recovered fluorescence intensity map. Supervised methods, on the other hand, rely on large datasets for training and often require retraining for new sample structures. Consequently, achieving high contrast near the maximum theoretical resolution in an unsupervised, physically principled manner remains an open problem. Here, we propose Bayesian-SIM (B-SIM), an unsupervised Bayesian framework to quantitatively reconstruct SIM data, rectifying these shortcomings by accurately incorporating known noise sources in the spatial domain. To accelerate the reconstruction process, we exploit the finite extent of the point spread function to devise a parallelized Monte Carlo strategy involving chunking and restitching of the inferred fluorescence intensity. We benchmark our framework on both simulated and experimental images, and demonstrate improved contrast permitting feature recovery at up to 25% shorter length scales than state-of-the-art methods at both high and low SNR. B-SIM enables unsupervised, quantitative, physically accurate reconstruction without the need for labeled training data, democratizing high-quality SIM reconstruction and expanding the capabilities of live-cell SIM to lower SNRs, potentially revealing biological features in previously inaccessible regimes.
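The chunk-and-restitch parallelization mentioned above rests on the finite extent of the point spread function: each tile can be inferred independently as long as it is padded by at least the PSF radius, and the padded borders are discarded when the tiles are reassembled. The following is a minimal Python sketch of that idea only, not the authors' B-SIM implementation; the tile size, padding, and the placeholder `infer_chunk` sampler are hypothetical choices for illustration.

```python
# Illustrative chunk-and-restitch sketch. `infer_chunk` stands in for a
# per-chunk Monte Carlo estimate of the fluorescence intensity; here it is
# a placeholder identity map so the example runs end to end.

import numpy as np
from concurrent.futures import ProcessPoolExecutor


def infer_chunk(raw_chunk: np.ndarray) -> np.ndarray:
    """Placeholder for a per-chunk posterior estimate of fluorescence intensity."""
    return raw_chunk.astype(float)


def chunk_and_restitch(raw: np.ndarray, chunk: int = 64, pad: int = 8) -> np.ndarray:
    """Split `raw` into padded tiles, infer each independently, then restitch.

    `pad` should cover the PSF extent so every interior pixel of a tile sees
    all photons that can contribute to it; the padded borders are dropped at
    stitching time, so no seams appear in the recovered intensity map.
    """
    h, w = raw.shape
    out = np.zeros((h, w), dtype=float)
    tiles, slots = [], []
    for y0 in range(0, h, chunk):
        for x0 in range(0, w, chunk):
            y1, x1 = min(y0 + chunk, h), min(x0 + chunk, w)
            ys, xs = max(y0 - pad, 0), max(x0 - pad, 0)
            ye, xe = min(y1 + pad, h), min(x1 + pad, w)
            tiles.append(raw[ys:ye, xs:xe])
            slots.append((y0, y1, x0, x1, y0 - ys, x0 - xs))
    # Each padded tile is processed in a separate worker process.
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(infer_chunk, tiles))
    # Keep only the unpadded interior of each inferred tile.
    for res, (y0, y1, x0, x1, oy, ox) in zip(results, slots):
        out[y0:y1, x0:x1] = res[oy:oy + (y1 - y0), ox:ox + (x1 - x0)]
    return out
```

In this sketch the overlap is fixed; in practice the padding would be chosen from the measured PSF width so that boundary effects in each chunk stay below the noise floor.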