Abstract
Early detection of keratoconus provides more treatment options, avoids invasive procedures, and helps halt the rapid progression of the disease. Unlike traditional keratoconus classification methods, this study presents a machine learning-based approach using transfer learning, applied to corneal topographic images. Classification is performed over three corneal classes: normal, suspicious, and keratoconus. It is carried out with six pretrained convolutional neural networks (CNNs): VGG16, InceptionV3, MobileNet, DenseNet201, Xception, and EfficientNetB0. Each classifier is trained individually on five datasets generated from an original dataset of 2924 corneal topographic images. The original images underwent dedicated preprocessing before being fed to the models in the training phase: corneal map images were separated into five datasets while noise and textual annotations were removed. Most of the models achieve good discrimination between normal, suspicious, and keratoconus corneas. The best classification accuracies, 99.31% and 98.51%, were obtained by DenseNet201 and VGG16, respectively. These results indicate that transfer learning can substantially improve the performance of keratoconus classification systems.
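The preprocessing step mentioned above (stripping textual annotations and noise from the topographic map images before training) is not detailed in the abstract. A minimal sketch of the annotation-removal idea, assuming annotations are rendered in a single known pixel value — a hypothetical simplification for illustration, not the authors' actual pipeline:

```python
import numpy as np

def strip_annotations(img: np.ndarray, text_value: int = 255, fill_value: int = 0) -> np.ndarray:
    """Crude annotation removal: replace pixels matching an assumed
    annotation colour (here pure white, 255) with a neutral fill value.
    A real pipeline would likely use masking by region or colour range."""
    cleaned = img.copy()              # leave the original image untouched
    cleaned[cleaned == text_value] = fill_value
    return cleaned

# Example on a toy 2x2 grayscale "image" containing annotation pixels
toy = np.array([[10, 255], [128, 255]], dtype=np.uint8)
cleaned = strip_annotations(toy)
```

In practice, each cleaned map image would then be routed into one of the five map-specific datasets before the individual CNNs are fine-tuned on it.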
International Journal of Online and Biomedical Engineering (iJOE)