Abstract
Diabetic retinopathy (DR) is a severe complication of diabetes mellitus that primarily affects the retinal tissue and carries a high risk of blindness. Ophthalmologists face challenges in grading the severity of DR due to its complexity and time constraints, so there is an urgent need for automated methods that detect DR from retinal fundus images. This study introduces a novel deep learning architecture that combines a multi-scale residual attention block (MSRAB) with a cross-attention block (CrAB) for DR grading. The MSRAB employs convolutional layers with diverse dilation rates to expand the receptive field and incorporates a residual attention network, enabling it to adaptively concentrate on pertinent features and prioritize critical characteristics in retinal fundus images. The CrAB integrates channel and spatial attention mechanisms to capture inter-channel interactions and spatial dependencies within the input features. Together, these blocks enable the model to focus on critical regions, suppress irrelevant features, and capture interrelations across different channels and spatial locations. A pre-trained backbone network extracts both local and global features, capturing the complex patterns needed to identify DR precisely while improving model efficiency, which is valuable in settings with limited data and computational resources. The proposed method outperforms existing approaches in accuracy, recall, precision, and Area Under the Curve (AUC).
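As an illustrative sketch only (the abstract does not specify the exact design), the following PyTorch code shows how blocks of the kind described above might be composed: parallel dilated convolutions for multi-scale context, a residual attention gate, and a channel-then-spatial cross-attention stage. The dilation rates, channel reduction ratio, gate ordering, and all layer names are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of an MSRAB and a CrAB, assuming a squeeze-and-
# excitation-style channel gate and a CBAM-style spatial gate; the
# paper's actual blocks may differ.
import torch
import torch.nn as nn

class MSRAB(nn.Module):
    """Multi-scale residual attention block: parallel 3x3 convolutions
    with different dilation rates widen the receptive field; a channel
    attention gate reweights the fused features before a residual add."""
    def __init__(self, channels: int, dilations=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList([
            nn.Conv2d(channels, channels, kernel_size=3, padding=d, dilation=d)
            for d in dilations
        ])
        self.fuse = nn.Conv2d(channels * len(dilations), channels, kernel_size=1)
        self.attn = nn.Sequential(            # channel attention gate (assumed)
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels // 4, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // 4, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x):
        multi = torch.cat([branch(x) for branch in self.branches], dim=1)
        fused = self.fuse(multi)
        return x + fused * self.attn(fused)   # residual attention

class CrAB(nn.Module):
    """Cross-attention block: a channel gate followed by a spatial gate,
    capturing inter-channel interactions and spatial dependencies
    (the ordering of the two gates is an assumption)."""
    def __init__(self, channels: int):
        super().__init__()
        self.channel_gate = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, channels, kernel_size=1),
            nn.Sigmoid(),
        )
        self.spatial_gate = nn.Sequential(    # operates on an (avg, max) map
            nn.Conv2d(2, 1, kernel_size=7, padding=3),
            nn.Sigmoid(),
        )

    def forward(self, x):
        x = x * self.channel_gate(x)
        desc = torch.cat([x.mean(dim=1, keepdim=True),
                          x.amax(dim=1, keepdim=True)], dim=1)
        return x * self.spatial_gate(desc)

# Example: feature map from a pre-trained backbone stage (shapes assumed).
feats = torch.randn(1, 256, 32, 32)
out = CrAB(256)(MSRAB(256)(feats))
print(out.shape)  # torch.Size([1, 256, 32, 32])
```

Padding each dilated branch by its own dilation rate keeps all branch outputs the same spatial size, so they can be concatenated and fused by a 1x1 convolution before the residual connection.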