Abstract
This study evaluated an intelligent diagnostic model for pterygium. For intelligent diagnosis, four attention mechanisms (SENet, ECANet, CBAM, and Self-Attention) were each fused with the lightweight MobileNetV2 architecture to construct a three-class classification model. The dataset comprised 1,220 anterior ocular segment images covering three pterygium categories, provided by the Eye Hospital of Nanjing Medical University. Conventional classification models (VGG16, ResNet50, MobileNetV2, and EfficientNetB7) were trained on the same dataset for comparison. Model performance was evaluated on 470 anterior segment test images in terms of accuracy, Kappa value, test time, sensitivity, specificity, area under the curve (AUC), and visual heat maps. The MobileNetV2+Self-Attention model, with a model size of 281 MB, achieved an accuracy of 92.77% and a Kappa value of 88.92%. Test time was 9 ms/image on the server and 138 ms/image on a local computer. For normal anterior segment images, sensitivity, specificity, and AUC were 99.47%, 100%, and 100%, respectively; for anterior segment images in the observation period, 88.30%, 95.32%, and 96.70%, respectively; and for anterior segment images in the surgery period, 88.18%, 94.44%, and 97.30%, respectively. The developed model is lightweight and can be used not only to detect pterygium but also to assess its severity.
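The abstract reports per-class sensitivity, specificity, and Cohen's Kappa for a three-class problem but does not show how they are computed. A minimal sketch of these metrics (one-vs-rest per class, Kappa over all classes) in plain Python is given below; the label encoding (0 = normal, 1 = observation period, 2 = surgery period) and the example predictions are hypothetical, not taken from the study.

```python
from collections import Counter

def per_class_metrics(y_true, y_pred, cls):
    """One-vs-rest sensitivity and specificity for a single class."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p == cls)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p != cls)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t != cls and p != cls)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != cls and p == cls)
    sensitivity = tp / (tp + fn) if tp + fn else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    return sensitivity, specificity

def cohens_kappa(y_true, y_pred):
    """Cohen's Kappa: observed agreement corrected for chance agreement."""
    n = len(y_true)
    observed = sum(1 for t, p in zip(y_true, y_pred) if t == p) / n
    true_counts = Counter(y_true)
    pred_counts = Counter(y_pred)
    expected = sum(true_counts[c] * pred_counts.get(c, 0)
                   for c in true_counts) / (n * n)
    return (observed - expected) / (1 - expected)

# Hypothetical labels: 0 = normal, 1 = observation period, 2 = surgery period
y_true = [0, 0, 1, 1, 2, 2, 0, 1, 2, 2]
y_pred = [0, 0, 1, 2, 2, 2, 0, 1, 2, 1]
sens, spec = per_class_metrics(y_true, y_pred, 0)
kappa = cohens_kappa(y_true, y_pred)
```

On this toy data, class 0 is classified perfectly (sensitivity and specificity both 1.0), while the Kappa value reflects the two confusions between the observation and surgery classes.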