Abstract

Eye problems can lead to vision loss and significantly affect daily life, underscoring the critical importance of early diagnosis and treatment to prevent further damage and complications. The proposed work develops a comprehensive segmentation-classification framework, VisionDeep-AI, for retinal vessel segmentation and multi-class classification of fundus images. A customized deep-learning model based on a weighted bi-directional feature pyramid network and a U-Net backbone is built to segment blood vessels, enhancing feature extraction and multi-scale feature fusion. The architecture is an end-to-end encoder-decoder network with six depth levels at varying resolutions, enabling the extraction of both high-level descriptors and lower-level, fine-grained characteristics. Furthermore, a multi-modal deep feature fusion architecture is developed for the multi-class classification of fundus images into four categories by integrating features from segmented vessel images and raw fundus images, thereby incorporating more diversified information. VisionDeep-AI is evaluated thoroughly on a comprehensive colour fundus image dataset. To ensure reliable results, data augmentation was applied to prevent over-fitting and to enhance the model's generalization capability. The segmentation model performed exceptionally well, achieving a high accuracy of 97.73% and a Dice coefficient of 89.90% for blood vessel segmentation from fundus images. Furthermore, the multi-modal deep feature-fused classification architecture achieved a test accuracy of 81.50%, with a specificity of 93.83%. These findings demonstrate the robustness, generalizability and efficiency of the proposed VisionDeep-AI framework for fundus image segmentation and retinal disease classification, contributing towards advanced medical technologies and diagnostic precision.
