Abstract
Fundus imaging is widely used to diagnose common diseases of the human eye. However, uneven brightness, blur, and low contrast often limit the diagnostic value of fundus images. This paper proposes an effective enhancement technique for fundus images that performs illumination correction, detail enhancement, and denoising. Motivated by the retinex model, a maximum a posteriori (MAP) estimator is used to estimate the illumination and reflectance components. The estimated illumination component is then gamma-corrected to smooth its unevenness, and a straightforward visual transformation algorithm is applied to the structure layer to correct residual uneven luminosity and enhance its details. The proposed method was evaluated on public datasets (STARE, CHASEDB1, and DRIVE). To further confirm its benefit in aiding diagnosis, quality assessments by ophthalmologists were also collected. The objective evaluation used performance metrics such as GMSD, NIQE, HPSI, VSI, MCSD, and PSNR. Experimental results show that the proposed method can simultaneously handle illumination enhancement, detail improvement, and noise and artifact suppression, outperforming state-of-the-art methods. According to the qualitative, quantitative, and statistical analyses, the proposed algorithm reliably enhances low-contrast retinal images while preserving color and a natural appearance.
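The pipeline summarized in the abstract amounts to: decompose the image into illumination and reflectance (structure) layers, gamma-correct the illumination to even out brightness, boost the structure layer's details, recombine, and denoise. The sketch below is a minimal illustrative approximation in Python (NumPy and OpenCV), not the authors' implementation: it substitutes a large-scale Gaussian blur for the MAP-based illumination estimator, and the function name and parameters (`enhance_fundus`, `gamma`, `detail_gain`, `sigma`) are assumptions introduced here for illustration.

```python
import cv2
import numpy as np

def enhance_fundus(bgr, gamma=0.6, detail_gain=1.5, sigma=30):
    """Simplified retinex-style enhancement sketch (illustrative only)."""
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV).astype(np.float32)
    v = hsv[:, :, 2] / 255.0

    # 1. Illumination estimate: a large-scale Gaussian blur stands in
    #    for the paper's MAP-based estimator.
    illumination = cv2.GaussianBlur(v, (0, 0), sigma) + 1e-6

    # 2. Reflectance (structure) layer from the retinex model V = L * R.
    reflectance = v / illumination

    # 3. Gamma correction smooths the uneven illumination layer.
    illumination = np.power(illumination, gamma)

    # 4. Mild boost of the structure layer's deviation from 1 to enhance
    #    vessel detail (placeholder for the paper's visual transformation).
    reflectance = np.clip(1.0 + detail_gain * (reflectance - 1.0), 0.0, None)

    # 5. Recombine, clip, and lightly denoise.
    hsv[:, :, 2] = np.clip(illumination * reflectance, 0.0, 1.0) * 255.0
    out = cv2.cvtColor(hsv.astype(np.uint8), cv2.COLOR_HSV2BGR)
    return cv2.fastNlMeansDenoisingColored(out, None, 3, 3, 7, 21)
```

A call such as `enhance_fundus(cv2.imread("fundus.png"))` would return the enhanced image; in practice the gamma and detail-gain values would need tuning per dataset.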