Abstract

Background: The availability of numerous fundus photographs and optical coherence tomography images has enabled the use of artificial intelligence (AI) in ophthalmology for detecting retinal and optic nerve disorders. Despite the importance of anterior segment screening and cataract diagnosis before fundus examination, AI remains little used for anterior segment examination and cataract detection, owing to the lack of standardized images and analysis models. In this study, we investigated whether our multiple machine-learning methods could diagnose cataract from videos recorded with a slit-lamp device.

Methods: A dataset comprising 18,596 retrospectively collected frames was used for training and cross-validation of a machine-learning algorithm. Agreement on cataract diagnosis, cataract severity grading, and surgical indication prediction between our model and evaluations performed by ophthalmologists was assessed with kappa statistics.

Findings: Our model diagnosed cataract with a sensitivity of 99·60% (95% confidence interval [CI], 99·40–99·70) and a specificity of 96·00% (95% CI, 83·40–99·30) relative to the ophthalmologists' diagnoses. The results of individual cataract grading were as follows: nuclear cataract (NUC)0: AUC, 0·987 (95% CI, 0·947–1·000); NUC1: AUC, 0·916 (95% CI, 0·888–0·945); NUC2: AUC, 0·862 (95% CI, 0·844–0·879); and NUC3: AUC, 0·943 (95% CI, 0·931–0·955). For overall cataract grading, the accuracy was 87·8% (kappa coefficient, 0·811 [95% CI, 0·791–0·831]). The AI predicted the surgical indication with a sensitivity of 91·80% (95% CI, 82·00–95·10) and a specificity of 92·30% (95% CI, 86·10–94·40). The cross-validation accuracy was 86·0% (kappa coefficient, 0·800 [95% CI, 0·780–0·820]).

Interpretation: We successfully created a high-performance cataract diagnostic model using multiple machine-learning methods from videos recorded by the slit-lamp device, which can simplify and expedite the diagnostic process.
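The screening metrics reported above can be sketched in code. The following is an illustrative computation only, using hypothetical confusion-matrix counts (not the study's data), showing how sensitivity, specificity, and the Cohen's kappa agreement statistic are derived from binary model-versus-ophthalmologist ratings:

```python
# Illustrative sketch (not the authors' code): computing sensitivity,
# specificity, and Cohen's kappa from a 2x2 confusion matrix.
# All counts below are hypothetical, for demonstration only.

def sensitivity(tp, fn):
    """True-positive rate: share of cataract eyes the model flags."""
    return tp / (tp + fn)

def specificity(tn, fp):
    """True-negative rate: share of cataract-free eyes the model clears."""
    return tn / (tn + fp)

def cohens_kappa(tp, fp, fn, tn):
    """Chance-corrected agreement between model and reference grader."""
    n = tp + fp + fn + tn
    observed = (tp + tn) / n
    # Expected agreement if model and grader rated independently,
    # from the marginal totals of the confusion matrix.
    expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical counts: 1,000 cataract frames, 50 cataract-free frames.
tp, fp, fn, tn = 995, 2, 5, 48
print(f"sensitivity = {sensitivity(tp, fn):.4f}")  # 0.9950
print(f"specificity = {specificity(tn, fp):.4f}")  # 0.9600
print(f"kappa       = {cohens_kappa(tp, fp, fn, tn):.4f}")
```

Kappa is reported alongside raw accuracy because, with heavily imbalanced classes (far more cataract than cataract-free frames), a model could reach high accuracy by chance agreement alone; kappa corrects for that.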
Trial Registration: UMIN-CTR Clinical Trial number (UMI40321)

Funding Statement: This work was supported by the Japan Agency for Medical Research and Development (20he1022003h0001 and 20hk0302008h0001), the Uehara Memorial Foundation, the Hitachi Global Foundation, and the Kondo Memorial Foundation. The authors have no other funding to declare, including from companies or patents.

Declaration of Interests: H.Y., E.S., and N.A. are co-founders of OUI Inc. and own stock in OUI Inc. OUI Inc. holds the patent for the Smart Eye Camera (Japanese Patent No. 6627071, Tokyo, Japan). There are no other relevant declarations related to this patent. The other authors declare no competing interests associated with this manuscript. OUI Inc. had no additional role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Ethics Approval Statement: This study adhered to the tenets of the Declaration of Helsinki and was conducted in compliance with the protocols (UMI40321, Table S1) approved by the Institutional Ethics Review Board of Keio University School of Medicine, Tokyo, Japan (IRB Nos. 20090277, 20170306, 20180206, and 20200021) and Tsurumi University School of Dental Medicine, Kanagawa, Japan (IRB No. 1634, Table S2). Written informed consent, which confirmed secondary use and/or provision to a third party for further use, was obtained from as many study participants as possible for data collection. However, the requirement for informed consent was waived in some cases because of the retrospective study design and the use of de-identified data.
