Abstract

Purpose

To report a rapid and accurate deep learning method for automatic segmentation and measurement of choroidal thickness (CT) in myopic eyes, and to determine the relationship between refractive error (RE) and CT.

Methods

Fifty-four healthy subjects aged 20–39 years were retrospectively reviewed. Data reviewed included age, gender, laterality, visual acuity, RE, and enhanced depth imaging optical coherence tomography (EDI-OCT) images. The choroid layer was labeled both manually and automatically on EDI-OCT images. A Mask Region-based Convolutional Neural Network (Mask R-CNN) model, using a deep Residual Network (ResNet) with a Feature Pyramid Network (FPN) as its backbone, was trained to automatically outline and quantify the choroid layer.

Results

The ResNet-50 backbone was adopted for its 90% accuracy rate and 6.97 s average execution time. CT determined by the manual method had a mean thickness of 258.75 ± 66.11 µm, a positive correlation with RE (r = 0.596, p < .01), and significant associations with gender (p = .011) and RE (p < .001) in multivariable linear regression analysis. CT determined by the deep learning method had a mean thickness of 226.39 ± 54.65 µm, a positive correlation with RE (r = 0.546, p < .01), and significant associations with gender (p = .043) and RE (p < .001) in multivariable linear regression analysis. Both methods showed that CT decreased as myopic RE increased.

Conclusions

This deep learning method based on Mask R-CNN determined the relationship between RE and CT accurately and rapidly. It could eliminate the need for manual processing and demonstrates a feasible clinical application.
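The Methods section describes a Mask R-CNN with a ResNet-50 + FPN backbone trained to segment the choroid in EDI-OCT B-scans. The abstract does not specify the framework or training configuration; the following is a minimal sketch of such a single-class (choroid) segmentation model, assuming PyTorch/torchvision, a COCO-pretrained starting point, and hypothetical B-scan dimensions for illustration.

```python
# Minimal sketch of a Mask R-CNN with a ResNet-50 + FPN backbone for
# single-class (choroid) instance segmentation, using torchvision.
# The library choice, class count, and input dimensions below are
# illustrative assumptions, not the authors' implementation.
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn
from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

NUM_CLASSES = 2  # background + choroid


def build_choroid_mask_rcnn():
    # Start from a COCO-pretrained Mask R-CNN (ResNet-50 backbone with FPN).
    model = maskrcnn_resnet50_fpn(weights="DEFAULT")

    # Replace the box head so it predicts only background vs. choroid.
    in_features = model.roi_heads.box_predictor.cls_score.in_features
    model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)

    # Replace the mask head for the same two classes.
    in_features_mask = model.roi_heads.mask_predictor.conv5_mask.in_channels
    model.roi_heads.mask_predictor = MaskRCNNPredictor(
        in_features_mask, 256, NUM_CLASSES
    )
    return model


if __name__ == "__main__":
    model = build_choroid_mask_rcnn().eval()
    # A grayscale EDI-OCT B-scan would be replicated to 3 channels and
    # normalized before inference; 496 x 768 is a hypothetical size.
    dummy_scan = torch.rand(3, 496, 768)
    with torch.no_grad():
        prediction = model([dummy_scan])[0]
    # prediction["masks"] holds per-instance soft masks; thresholding the
    # choroid mask and counting pixels per column yields a thickness
    # profile that can be converted to micrometers via the axial scale.
    print(prediction["masks"].shape)
```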

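The Results report a Pearson correlation between CT and RE and a multivariable linear regression of CT on gender and RE. The abstract does not name the statistics software; below is a hedged sketch of that analysis using scipy and statsmodels, with assumed column names for illustration.

```python
# Sketch of the statistical analysis summarized in the Results:
# Pearson correlation between choroidal thickness (CT) and refractive
# error (RE), plus a multivariable linear regression of CT on gender and RE.
# Column names and library choice are assumptions, not the authors' pipeline.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf


def analyze_ct(df: pd.DataFrame):
    """df holds one row per eye with columns:
    'ct_um' (choroidal thickness, µm), 're_d' (spherical equivalent, D),
    and 'gender' (categorical)."""
    # Pearson correlation between CT and RE
    # (e.g. r = 0.596, p < .01 for the manual measurements in the abstract).
    r, p = stats.pearsonr(df["ct_um"], df["re_d"])

    # Multivariable linear regression: CT explained by gender and RE.
    model = smf.ols("ct_um ~ C(gender) + re_d", data=df).fit()
    return r, p, model.summary()
```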