Abstract

Fundus images captured for clinical diagnosis often suffer from degradation caused by variation in equipment, operators, or environment. These degraded images must be enhanced to support better diagnosis and improve the results of downstream tasks. Because paired low- and high-quality fundus images are unavailable, existing methods mainly address the color fundus image enhancement (CFIE) task with supervised or semi-supervised learning on synthetic image pairs, which introduces a domain gap between real and synthetic images. In existing unsupervised methods, small-scale pathological features and structural information, the most diagnostically important content in degraded fundus images, are prone to being erased during enhancement. To solve these problems, an unsupervised generative adversarial network (GAN) is proposed for the CFIE task that uses adversarial training to enhance low-quality fundus images, so synthetic image pairs are no longer required during training. A specially designed U-Net with skip connections in the enhancement network effectively removes degradation while preserving pathological features and structural information. Global and local discriminators adopted in the GAN yield better illumination uniformity in the enhanced images. To further improve visual quality, a novel non-reference loss function based on a pretrained fundus image quality classification network guides the enhancement network toward producing high-quality images. Experiments demonstrate that the method effectively removes degradation from low-quality fundus images and achieves competitive results compared with previous methods on both quantitative and qualitative metrics.
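To make the loss composition concrete, the following is a minimal PyTorch sketch of how a generator objective like the one described above might be assembled: adversarial terms from a global and a local discriminator, plus a non-reference quality loss computed by a frozen, pretrained quality classifier. The function names, the patch size, the loss weights, and the assumption that class index 1 denotes "high quality" are all illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn.functional as F

def random_patch(img, size=64):
    """Crop one random square patch per batch for the local discriminator."""
    _, _, h, w = img.shape
    top = torch.randint(0, h - size + 1, (1,)).item()
    left = torch.randint(0, w - size + 1, (1,)).item()
    return img[:, :, top:top + size, left:left + size]

def generator_loss(enhanced, d_global, d_local, quality_net,
                   w_adv=1.0, w_quality=0.5):
    """Composite generator loss: global + local adversarial terms plus a
    non-reference quality term from a pretrained quality classifier.
    All weights and names here are hypothetical, for illustration only."""
    # Global adversarial term: whole-image realism.
    logits_g = d_global(enhanced)
    adv_global = F.binary_cross_entropy_with_logits(
        logits_g, torch.ones_like(logits_g))

    # Local adversarial term: patch-level realism, which encourages
    # uniform illumination across the fundus image.
    logits_l = d_local(random_patch(enhanced))
    adv_local = F.binary_cross_entropy_with_logits(
        logits_l, torch.ones_like(logits_l))

    # Non-reference quality term: the pretrained classifier's weights stay
    # fixed, but gradients still flow back to the generator. Class index 1
    # is assumed to mean "high quality".
    quality_logits = quality_net(enhanced)
    target = torch.ones(enhanced.size(0), dtype=torch.long,
                        device=enhanced.device)
    quality_loss = F.cross_entropy(quality_logits, target)

    return w_adv * (adv_global + adv_local) + w_quality * quality_loss
```

In practice the classifier would be frozen before training (e.g., `p.requires_grad_(False)` for each parameter of `quality_net`), so that only the enhancement network receives updates from the quality term.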
