Abstract
The Gleason grading system is a dependable method for quantifying the aggressiveness of prostate cancer. This paper introduces a fast multiphoton microscopic imaging method based on deep learning for automatic Gleason grading. To resolve the trade-off between multiphoton microscopy (MPM) imaging speed and image quality, a deep learning architecture (SwinIR) was used for image super-resolution. Restoring the quality of low-resolution images reduced the acquisition time from 7.55 s to 0.24 s per frame. A classification network (Swin Transformer) was then introduced for automated Gleason grading. Training on high-resolution images achieved a classification accuracy of 90.9% and a Macro-F1 of 90.9%; training on super-resolution images achieved 89.9% for both metrics, showing that super-resolution images provide performance comparable to high-resolution images. These results suggest that combining MPM with image super-resolution and automatic classification holds potential as a real-time clinical diagnostic tool for prostate cancer.
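The sketch below is a minimal illustration (not the authors' code) of the two-stage pipeline the abstract describes: a super-resolution model restores fast low-resolution MPM frames, and a Swin Transformer classifier assigns a Gleason grade. The class count, image size, and the placeholder super-resolution module are assumptions; timm's Swin Transformer is used as a stand-in for the classification network, and the upsampling layer stands in for a trained SwinIR model.

```python
import torch
import torch.nn as nn
import timm

NUM_GLEASON_CLASSES = 4  # assumed number of grading categories, for illustration only

# Placeholder for the SwinIR super-resolution network; in practice a trained SwinIR
# model would be loaded here to restore the fast low-resolution MPM frames.
sr_model = nn.Upsample(scale_factor=4, mode="bicubic", align_corners=False)

# Swin Transformer classifier (timm implementation) acting as the Gleason-grading network.
classifier = timm.create_model(
    "swin_tiny_patch4_window7_224", pretrained=False, num_classes=NUM_GLEASON_CLASSES
)
classifier.eval()

def grade_frame(low_res_frame: torch.Tensor) -> torch.Tensor:
    """Super-resolve a fast low-resolution MPM frame, then predict its Gleason grade."""
    with torch.no_grad():
        high_res = sr_model(low_res_frame)  # (B, 3, 56, 56) -> (B, 3, 224, 224)
        logits = classifier(high_res)       # (B, NUM_GLEASON_CLASSES)
    return logits.argmax(dim=1)

# Example: one fast-acquisition frame at quarter resolution.
frame = torch.rand(1, 3, 56, 56)
print(grade_frame(frame))
```

In the paper's setting, the super-resolution stage is what allows the fast (0.24 s per frame) acquisition to feed the classifier with images of sufficient quality.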