Abstract

Optical microscopy imaging is the gold standard for the diagnosis of cancers, since it allows cell-level visualization of tissues. Imaging quality is largely determined by the focus distance between the lens and the object. A robust and efficient autofocusing algorithm is therefore required to obtain the optimal focus position, especially in robot-assisted microscopy systems. In this letter, we propose a diversity-aware learning framework that predicts the optimal focus position from a single image, without any reference. For robust and accurate estimation, a two-point representation of the distance to the optimal focus position is used for label distribution learning. To reduce the intra-class variation caused by the diversity of pathological slides, we present an intra-class discrepancy penalty term, combined with a composite loss and a gradient-domain input strategy, so that the network concentrates on image focus quality. Experiments on real microscopy datasets demonstrate that the proposed method achieves promising accuracy, real-time performance, and generalization. The mean absolute error is 0.308 μm, which is within the depth of field of the microscope, outperforming previous no-reference approaches by 39%.
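To make the two ingredients named in the abstract concrete, the sketch below shows one plausible realisation: a gradient-domain preprocessing step (here a simple first-order gradient magnitude; the paper may use a different operator) and a two-point encoding that spreads a continuous defocus distance over its two neighbouring bins on a discrete focus grid for label distribution learning. The grid spacing, the linear-interpolation weighting, and all function names are illustrative assumptions, not the authors' exact formulation.

```python
import numpy as np

def gradient_domain(image):
    """Map a grayscale image to its gradient-magnitude representation.

    Assumed form of the gradient-domain input strategy: first-order
    finite differences; the published method may use Sobel or similar.
    """
    gy, gx = np.gradient(image.astype(np.float32))
    return np.hypot(gx, gy)

def two_point_label(distance, grid):
    """Encode a continuous defocus distance as a two-point distribution.

    `grid` is a sorted 1-D array of discrete focus positions (in um).
    Probability mass is split between the two neighbouring grid points
    by linear interpolation -- an assumed instance of the abstract's
    two-point representation.
    """
    grid = np.asarray(grid, dtype=np.float32)
    distance = float(np.clip(distance, grid[0], grid[-1]))
    label = np.zeros_like(grid)
    hi = int(np.searchsorted(grid, distance))
    if hi == 0 or grid[hi] == distance:
        label[hi] = 1.0          # distance falls exactly on a grid point
        return label
    lo = hi - 1
    w = (distance - grid[lo]) / (grid[hi] - grid[lo])
    label[lo], label[hi] = 1.0 - w, w
    return label

# Example: a 0.5 um focus grid and a defocus distance of 1.3 um.
grid = np.arange(-5.0, 5.01, 0.5)
print(two_point_label(1.3, grid).round(2))  # mass 0.4 / 0.6 on the 1.0 / 1.5 um bins
```

A soft two-point target of this kind keeps the expected value of the label distribution equal to the true distance, so a network trained against it with a distribution loss can still recover sub-bin accuracy; how the intra-class discrepancy penalty is attached to this loss is not specified in the abstract and is therefore omitted here.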
