Abstract

Regular screening and timely treatment play a crucial role in limiting the progression and visual impairment caused by cataracts, the leading cause of blindness in Thailand and many other countries. Although cataracts are preventable and treatable, patients often delay seeking medical attention because the condition develops gradually and remains relatively asymptomatic. To address this challenge, this research identifies cataract abnormalities using image processing techniques and machine learning for preliminary assessment. A LeNet convolutional neural network (LeNet-CNN) model is trained on a dataset of digital camera images, and its performance is compared with a support vector machine (SVM) model in categorizing cataract abnormalities. In testing, the LeNet-CNN model achieves an accuracy of 96%, a sensitivity of 95% for detecting positive cases, and a specificity of 96% for correctly identifying negative cases, surpassing results reported in previous studies and confirming the accuracy and effectiveness of the proposed approach. By combining image processing technology with convolutional neural networks, this research provides an effective tool for initial cataract screening: patients can independently assess their eye health by capturing self-images, facilitating early intervention and medical consultations. The proposed method therefore holds promise for enhancing the preliminary assessment of cataracts, enabling early detection and timely access to appropriate care.
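To make the classification setup concrete, the sketch below shows a minimal LeNet-style binary classifier in Keras. The input size (64x64 RGB eye-region crops), layer widths, optimizer, and preprocessing are illustrative assumptions, not the authors' exact configuration.

```python
# Minimal sketch of a LeNet-style CNN for cataract vs. normal screening.
# Input shape, layer sizes, and training settings are assumptions for illustration.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_lenet(input_shape=(64, 64, 3)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(6, kernel_size=5, activation="tanh"),
        layers.AveragePooling2D(pool_size=2),
        layers.Conv2D(16, kernel_size=5, activation="tanh"),
        layers.AveragePooling2D(pool_size=2),
        layers.Flatten(),
        layers.Dense(120, activation="tanh"),
        layers.Dense(84, activation="tanh"),
        layers.Dense(1, activation="sigmoid"),  # probability of cataract
    ])
    model.compile(
        optimizer="adam",
        loss="binary_crossentropy",
        metrics=[
            "accuracy",
            tf.keras.metrics.Recall(name="sensitivity"),  # true positive rate
        ],
    )
    return model

# Hypothetical usage, assuming preprocessed image arrays and binary labels:
# model = build_lenet()
# model.fit(train_images, train_labels, validation_split=0.2, epochs=20)
```

Reported sensitivity and specificity would then be computed from the model's predictions on a held-out test set.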
