Abstract

Optical microscopy imaging provides cellular-level visualization of tissues and has therefore become the gold standard for cancer diagnosis. Despite the significant contributions of optical microscopy to pathology, a fundamental prerequisite is capturing high-quality, all-in-focus images at high speed. In this work, we propose a Kernel Distillation AutoFocus (KDAF) method that predicts the defocus distance from a single-frame image. This work combines the advantages of previous virtual-refocusing and single-shot focus estimation methods. During the training phase, the focus distance is formulated as a latent variable that determines the kernel of the defocus blurring. The in-focus samples are processed with this kernel by a conditional convolutional neural network to generate defocused images. During the testing phase, we directly predict the focus distance from the latent features. Compared with previous works, the proposed method introduces paired focused/defocused samples as strong supervision to learn representative features for focus estimation. Experiments on the BBBC and Incoherent datasets demonstrate that the proposed method can accurately estimate the optimal focus distance from single-shot imaging. Accurate and automated focus estimation may improve the imaging quality of microscopes, which can benefit robust diagnosis.
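
The abstract describes a training objective in which a conditional network blurs in-focus frames according to a latent defocus distance, while a regressor learns to recover that distance from a single defocused frame. The following is a minimal PyTorch-style sketch of that idea only; the module names, layer sizes, kernel size, and loss formulation are illustrative assumptions and not the authors' KDAF implementation.

```python
# Illustrative sketch only: all architecture choices here are assumptions,
# not the KDAF method as published.
import torch
import torch.nn as nn

class DefocusGenerator(nn.Module):
    """Conditional CNN: blurs an in-focus image according to a defocus distance."""
    def __init__(self, hidden=32):
        super().__init__()
        # A small branch maps the (latent) defocus distance to a blur kernel.
        self.kernel_net = nn.Sequential(
            nn.Linear(1, hidden), nn.ReLU(), nn.Linear(hidden, 9 * 9)
        )
        self.refine = nn.Sequential(
            nn.Conv2d(1, hidden, 3, padding=1), nn.ReLU(),
            nn.Conv2d(hidden, 1, 3, padding=1),
        )

    def forward(self, in_focus, distance):
        # Predict a per-sample 9x9 kernel and apply it to the in-focus frame.
        k = torch.softmax(self.kernel_net(distance.view(-1, 1)), dim=-1)
        k = k.view(-1, 1, 9, 9)
        blurred = torch.stack([
            nn.functional.conv2d(img.unsqueeze(0), ker.unsqueeze(0), padding=4)
            for img, ker in zip(in_focus, k)
        ]).squeeze(1)
        return self.refine(blurred)

class FocusEstimator(nn.Module):
    """Regresses the defocus distance from a single defocused frame."""
    def __init__(self, hidden=32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, hidden, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(hidden, hidden, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(hidden, 1)

    def forward(self, defocused):
        return self.head(self.features(defocused))

# Paired supervision: the generator reconstructs the observed defocused frame
# from the in-focus frame and the known distance, while the estimator regresses
# that distance from the defocused frame alone (single-shot at test time).
gen, est = DefocusGenerator(), FocusEstimator()
in_focus = torch.rand(4, 1, 64, 64)
defocused = torch.rand(4, 1, 64, 64)
distance = torch.rand(4)
recon = gen(in_focus, distance)
loss = nn.functional.mse_loss(recon, defocused) + \
       nn.functional.mse_loss(est(defocused).squeeze(-1), distance)
loss.backward()
```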
