Abstract
Accurate segmentation of diabetic retinopathy lesions is a key requirement for correctly estimating the severity level. Because retinal lesions vary in shape, size, and number, manual grading is a time-consuming and challenging task. This motivates an automated segmentation framework that can accurately delineate lesion regions and their margins, aiding ophthalmologists in early detection and in monitoring the severity level. The proposed work introduces a customized U-Net model based on the pre-trained InceptionV3 CNN. It uses a periodic shuffling approach with sub-pixel convolution initialized to convolution nearest-neighbor resize. The model was trained and validated on non-proliferative diabetic retinopathy lesions, that is, microaneurysms, hemorrhages, and hard and soft exudates, using two standard datasets, IDRiD and DIARETDB1. On the IDRiD dataset, the proposed model achieves 99.82% accuracy, 88.39% sensitivity, and 99.94% specificity for hemorrhages; 99.95% accuracy, 70.76% sensitivity, and 99.98% specificity for microaneurysms; 99.83% accuracy, 85.9% sensitivity, and 99.95% specificity for hard exudates; and 99.62% accuracy, 85.38% sensitivity, and 99.71% specificity for soft exudates. Of the two datasets, the proposed system performed best on IDRiD, with results superior to state-of-the-art outcomes for segmentation of the retinal lesions.
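The "sub-pixel convolution initialized to convolution nearest-neighbor resize" in the abstract appears to refer to the ICNR-style initialization, in which the filters of a sub-pixel (periodic shuffling) convolution are initialized so that its initial output equals a nearest-neighbor upsampling, suppressing checkerboard artifacts. Below is a minimal PyTorch sketch of that idea under this assumption; the names `icnr_init` and `SubPixelUpsample` are illustrative and not taken from the paper.

```python
import torch
import torch.nn as nn

def icnr_init(weight, upscale_factor=2, init=nn.init.kaiming_normal_):
    # ICNR-style initialization (assumption: this matches the paper's
    # "initialized to convolution nearest-neighbor resize"): initialize a
    # smaller kernel, then repeat each filter upscale_factor^2 times so the
    # pixel-shuffled output initially equals nearest-neighbor upsampling.
    out_channels, in_channels, kh, kw = weight.shape
    sub_kernel = torch.empty(out_channels // upscale_factor ** 2,
                             in_channels, kh, kw)
    init(sub_kernel)
    weight.data.copy_(sub_kernel.repeat_interleave(upscale_factor ** 2, dim=0))

class SubPixelUpsample(nn.Module):
    # Sub-pixel (periodic shuffling) upsampling block: a convolution that
    # expands channels by upscale_factor^2, followed by PixelShuffle, which
    # rearranges those channels into a higher-resolution feature map.
    def __init__(self, in_channels, out_channels, upscale_factor=2):
        super().__init__()
        self.conv = nn.Conv2d(in_channels,
                              out_channels * upscale_factor ** 2,
                              kernel_size=3, padding=1)
        icnr_init(self.conv.weight, upscale_factor)
        self.shuffle = nn.PixelShuffle(upscale_factor)

    def forward(self, x):
        return self.shuffle(self.conv(x))

# Example: upsample a 64-channel feature map by 2x in a U-Net-style decoder.
up = SubPixelUpsample(64, 32)
y = up(torch.randn(1, 64, 32, 32))  # shape: (1, 32, 64, 64)
```

In a U-Net-style decoder such as the one described, blocks like this would replace transposed convolutions at each upsampling stage; the ICNR initialization matters because randomly initialized sub-pixel convolutions are prone to checkerboard artifacts early in training.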