Abstract

Background
Retinal vessel segmentation is critical for ocular health assessment. Traditional methods may lack precision, prompting exploration of advanced techniques. U-Net, a deep learning architecture, shows promise in handling the intricate nature of retinal vessel segmentation.

Methodology
This study focuses on the segmentation of thermographic fundus images using the U-Net architecture. A dataset of 125 images, categorized as normal and abnormal, underwent preprocessing, normalization, and augmentation. The U-Net model, with its contracting, bottleneck, and expansive paths, was implemented for accurate segmentation. A handheld thermographic fundus imaging product was introduced, featuring Human-Computer Interaction and a user-friendly interface to optimize interaction and streamline the diagnostic process.

Results
The segmentation accuracy achieved using U-Net stood at a promising 93.5%. Precision, recall, and F1-score metrics were employed for a detailed evaluation, showcasing the model's ability to identify abnormalities while minimizing false positives. The integration of a thermographic fundus imaging product significantly reduced processing time, demonstrating potential clinical utility. Leave-One-Out Cross-Validation affirmed the model's consistency, achieving an overall accuracy of 93.7%. A comparative analysis revealed U-Net's superiority over the Fully Convolutional Network (FCN) by 7%.

Conclusion
This study establishes U-Net's efficacy in thermographic fundus image segmentation, offering gains in precision and efficiency. The proposed imaging product streamlines diagnostics, and U-Net's advantage over the FCN in retinal vessel segmentation contributes to advanced medical image analysis.
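The contracting, bottleneck, and expansive paths mentioned in the Methodology can be illustrated with a minimal U-Net sketch. The snippet below is an assumption-laden illustration, not the paper's implementation: the framework (PyTorch), the two-level depth, the channel widths, and the single-channel grayscale input are choices made here for brevity.

```python
# Minimal two-level U-Net sketch in PyTorch. Depth, channel widths, and the
# single-channel input are assumptions for illustration, not paper details.
import torch
import torch.nn as nn

def double_conv(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU, the basic block used on both paths.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )

class UNet(nn.Module):
    def __init__(self, in_ch=1, out_ch=1):
        super().__init__()
        # Contracting path: conv blocks followed by 2x2 max pooling.
        self.enc1 = double_conv(in_ch, 64)
        self.enc2 = double_conv(64, 128)
        self.pool = nn.MaxPool2d(2)
        # Bottleneck between the two paths.
        self.bottleneck = double_conv(128, 256)
        # Expansive path: upsampling, concatenation with skip features, conv blocks.
        self.up2 = nn.ConvTranspose2d(256, 128, kernel_size=2, stride=2)
        self.dec2 = double_conv(256, 128)
        self.up1 = nn.ConvTranspose2d(128, 64, kernel_size=2, stride=2)
        self.dec1 = double_conv(128, 64)
        # 1x1 convolution maps features to a per-pixel vessel probability.
        self.head = nn.Conv2d(64, out_ch, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        b = self.bottleneck(self.pool(e2))
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return torch.sigmoid(self.head(d1))

# Example: one 256x256 thermographic fundus image -> (1, 1, 256, 256) vessel map.
model = UNet()
mask = model(torch.randn(1, 1, 256, 256))
```

The skip connections between the contracting and expansive paths are what let the network recover fine vessel boundaries lost during pooling, which is why U-Net tends to outperform a plain FCN on thin-structure segmentation tasks like this one.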

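The evaluation described in the Results can likewise be sketched. The snippet below shows, under assumptions not taken from the paper (pixel-wise scoring with scikit-learn, a 0.5 probability threshold, NumPy arrays for the masks), how accuracy, precision, recall, and F1 might be computed for one predicted mask, together with a Leave-One-Out split over the 125-image dataset.

```python
# Illustrative metric computation and LOOCV split skeleton. The thresholding
# at 0.5 and the use of NumPy/scikit-learn are assumptions, not paper details.
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support
from sklearn.model_selection import LeaveOneOut

def segmentation_metrics(pred_prob, gt_mask, threshold=0.5):
    """Score a predicted vessel-probability map against a binary ground truth, pixel-wise."""
    pred = (pred_prob.ravel() >= threshold).astype(np.uint8)
    gt = gt_mask.ravel().astype(np.uint8)
    precision, recall, f1, _ = precision_recall_fscore_support(
        gt, pred, average="binary", zero_division=0
    )
    return {
        "accuracy": accuracy_score(gt, pred),
        "precision": precision,
        "recall": recall,
        "f1": f1,
    }

# Leave-One-Out Cross-Validation over the 125-image dataset: each image is
# held out once while the remaining 124 are available for training.
image_indices = np.arange(125).reshape(-1, 1)
for train_idx, test_idx in LeaveOneOut().split(image_indices):
    # Train on the 124 images at train_idx, evaluate on the single held-out image.
    pass
```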