Abstract

The increasing throughput of experiments in biomaterials research makes automated techniques increasingly necessary. Among characterization methods, microscopy makes fundamental contributions to biomaterials science, where precisely focused images are the basis of related research. Although automatic focusing has been widely applied in all kinds of microscopes, defocused images can still be acquired now and then due to factors including background noise from materials and mechanical errors. Herein, we present a deep-learning-based method for the automatic sorting and reconstruction of defocused cell images. First, the defocusing problem is illustrated on a high-throughput cell microarray. Then, a comprehensive dataset of phase-contrast images captured under varied conditions, spanning multiple cell types, magnifications, and substrate materials, is prepared to establish and test our method. We obtain an accuracy of over 0.993 on the dataset using a simple network architecture that requires less than half the training time of the classical ResNetV2 architecture. Moreover, subcellular-level reconstruction of heavily defocused cell images is achieved with another architecture. The applicability of the established workflow in practice is finally demonstrated on the high-throughput cell microarray. The intelligent workflow requires no a priori knowledge of focusing algorithms and thus has broad application value in cell experiments involving high-throughput or time-lapse imaging.
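The sorting step described above is a binary image classification task (focused vs. defocused). The paper's "simple network architecture" is not specified in this abstract, so the following is only an illustrative sketch of the general idea — a tiny, untrained convolutional classifier implemented in plain NumPy, with hypothetical kernel and head parameters standing in for learned weights:

```python
import numpy as np

rng = np.random.default_rng(0)

def conv2d(img, kernels):
    """Valid 2D convolution of an (H, W) image with (K, kh, kw) kernels."""
    K, kh, kw = kernels.shape
    H, W = img.shape
    out = np.empty((K, H - kh + 1, W - kw + 1))
    for k in range(K):
        for i in range(H - kh + 1):
            for j in range(W - kw + 1):
                out[k, i, j] = np.sum(img[i:i + kh, j:j + kw] * kernels[k])
    return out

def classify(img, kernels, w, b):
    """Tiny CNN: conv -> ReLU -> global average pool -> linear -> softmax."""
    feat = np.maximum(conv2d(img, kernels), 0)  # conv + ReLU
    pooled = feat.mean(axis=(1, 2))             # global average pooling
    logits = pooled @ w + b                     # linear head, 2 classes
    e = np.exp(logits - logits.max())
    return e / e.sum()                          # softmax probabilities

# Hypothetical (untrained) parameters: 8 kernels of 3x3, head to 2 classes.
kernels = rng.normal(size=(8, 3, 3))
w, b = rng.normal(size=(8, 2)), np.zeros(2)

# A random 32x32 stand-in for a grayscale phase-contrast crop.
probs = classify(rng.normal(size=(32, 32)), kernels, w, b)
print(probs)  # two class probabilities (focused, defocused) summing to 1
```

In practice such a classifier would be trained on labeled focused/defocused crops and applied per image before deciding whether to pass it on to the reconstruction network; the shapes and layer counts here are assumptions, not the published architecture.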


