Abstract
Diabetic retinopathy (DR) is a chronic eye disease and a leading cause of blindness among adults. Manual DR screening requires skilled readers, effort, and time, so automated detection and classification of DR severity is important for early and effective disease control. Mobile devices can facilitate regular screening of retinal fundus images using lightweight automated architectures. While many automated techniques have been proposed in the literature, new models are needed to improve accuracy and make such models more accessible. In this study, MobileNetV2 is used to classify five classes of DR severity: no DR, mild DR, moderate DR, severe DR, and proliferative DR. Colour fundus images are enhanced using contrast-limited adaptive histogram equalization (CLAHE) to mitigate poor lighting conditions. MobileNetV2 extracts image features, and a new classification head based on batch normalization and L2 regularization is applied to the extracted features. The model is evaluated with and without data augmentation and with and without fine-tuning, and the experiments indicate that both augmentation and fine-tuning improve the evaluation metrics. The experimental results show the superiority of the proposed technique compared to recent techniques in the literature, with training, validation, and test accuracies of 98.65%, 87.94%, and 87.67%, respectively.
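To make the described pipeline concrete, the sketch below shows CLAHE enhancement of a colour fundus image followed by a MobileNetV2 backbone with a batch-normalized, L2-regularized classification head in Keras. This is a minimal illustration, not the authors' exact implementation: the dense-layer width, dropout rate, L2 strength, learning rate, and CLAHE parameters are assumptions, since the abstract does not specify them.

```python
import cv2
import tensorflow as tf
from tensorflow.keras import layers, models, regularizers

def clahe_enhance(bgr_image, clip_limit=2.0, tile_grid_size=(8, 8)):
    """Apply CLAHE to the lightness channel of a colour fundus image (LAB space)."""
    lab = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=clip_limit, tileGridSize=tile_grid_size)
    l_eq = clahe.apply(l)
    return cv2.cvtColor(cv2.merge((l_eq, a, b)), cv2.COLOR_LAB2BGR)

def build_model(input_shape=(224, 224, 3), num_classes=5, fine_tune=False):
    """MobileNetV2 feature extractor with a BN + L2-regularized classification head.

    Layer sizes, dropout, and regularization strength are illustrative assumptions.
    """
    base = tf.keras.applications.MobileNetV2(
        input_shape=input_shape, include_top=False, weights="imagenet")
    base.trainable = fine_tune  # frozen for feature extraction; unfrozen for fine-tuning

    inputs = layers.Input(shape=input_shape)
    x = tf.keras.applications.mobilenet_v2.preprocess_input(inputs)
    x = base(x, training=False)
    x = layers.GlobalAveragePooling2D()(x)
    x = layers.BatchNormalization()(x)
    x = layers.Dense(256, activation="relu",
                     kernel_regularizer=regularizers.l2(1e-4))(x)
    x = layers.BatchNormalization()(x)
    x = layers.Dropout(0.3)(x)
    outputs = layers.Dense(num_classes, activation="softmax")(x)

    model = models.Model(inputs, outputs)
    model.compile(optimizer=tf.keras.optimizers.Adam(1e-4),
                  loss="categorical_crossentropy", metrics=["accuracy"])
    return model
```

In this sketch, the five output units correspond to the five DR severity grades, and setting `fine_tune=True` unfreezes the MobileNetV2 backbone, matching the fine-tuned configuration evaluated in the study.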