Abstract

Fluorescein angiography (FA) is a diagnostic method for observing vascular circulation in the eye, but it poses a risk to patients. Generative adversarial networks (GANs) have therefore been used to translate retinal fundus structure images into FA images. Existing high-resolution image generation methods rely on complex deep network models that are difficult to optimize, leading to blurred lesion boundaries and poor capture of microleakage and microvessels. In this study, we propose a multiple-ResNet GAN to improve model training and thereby enhance the generation of high-resolution FA images. First, the multiple-ResNet generator is designed to enhance detail generation in high-resolution images. Second, the Gaussian error linear unit (GELU) activation function is used to help the model converge rapidly. The effectiveness of the multiple-ResNet GAN is verified on the publicly available Isfahan MISP dataset. Experimental results show that our method outperforms state-of-the-art methods quantitatively, with a mean structural similarity (SSIM) of 0.641, a peak signal-to-noise ratio (PSNR) of 18.25 dB, and a learned perceptual image patch similarity (LPIPS) of 0.272, confirming that the multiple-ResNet framework and the GELU activation function improve the generation of detailed regions in high-resolution FA images.
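The abstract gives no implementation details, so the following is only a minimal, hypothetical sketch of the kind of GELU-activated residual generator the abstract describes, assuming PyTorch. GELU(x) = x·Φ(x), where Φ is the standard normal CDF, which gives a smoother nonlinearity than ReLU. All names and hyperparameters here (ResidualBlock, MultipleResNetGenerator, width, n_blocks, single-channel FA output) are illustrative assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn


class ResidualBlock(nn.Module):
    """Hypothetical GELU-activated residual block (assumed design, not the paper's code)."""

    def __init__(self, channels: int):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.InstanceNorm2d(channels),
            nn.GELU(),  # GELU(x) = x * Phi(x); smooth activation, reportedly aids convergence
            nn.Conv2d(channels, channels, kernel_size=3, padding=1),
            nn.InstanceNorm2d(channels),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Identity shortcut: keeps gradients flowing through deep stacks of blocks.
        return x + self.body(x)


class MultipleResNetGenerator(nn.Module):
    """Illustrative generator: a stack of ResNet blocks mapping a fundus image to an FA image."""

    def __init__(self, in_ch: int = 3, out_ch: int = 1, width: int = 64, n_blocks: int = 9):
        super().__init__()
        self.head = nn.Sequential(nn.Conv2d(in_ch, width, kernel_size=7, padding=3), nn.GELU())
        self.blocks = nn.Sequential(*[ResidualBlock(width) for _ in range(n_blocks)])
        self.tail = nn.Conv2d(width, out_ch, kernel_size=7, padding=3)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # tanh maps output to [-1, 1], the usual range for GAN-generated images.
        return torch.tanh(self.tail(self.blocks(self.head(x))))


# Usage sketch: a 3-channel fundus image in, a 1-channel synthetic FA image out.
fundus = torch.randn(1, 3, 512, 512)
fa = MultipleResNetGenerator()(fundus)  # shape: (1, 1, 512, 512)
```

In a full GAN setup this generator would be trained adversarially against a discriminator, with the residual shortcuts easing optimization of the deep model, which is the training difficulty the abstract targets.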
