Abstract
Background
Thinning of the choroid has been linked with various ocular diseases, including high myopia (HM), which can lead to visual impairment. Although various artificial intelligence (AI) algorithms have been developed to quantify choroidal thickness (ChT), few patients with HM were included in their development. The choroid in patients with HM tends to be thinner than that in non-HM patients, making it harder to segment. Therefore, in this study, we aimed to develop and implement a novel deep learning algorithm based on a group-wise context selection network (GCS-Net) to automatically segment the choroid and quantify its thickness on swept-source optical coherence tomography (SS-OCT) images of HM patients.

Methods
A total of 720 SS-OCT images were obtained from 40 HM eyes and 20 non-HM eyes and were used to develop a GCS-Net to segment the choroid. The intersection-over-union (IoU), Dice similarity coefficient (DSC), sensitivity, and specificity were used to assess segmentation performance against manually segmented ground truth. The independent test dataset included 3,192 images from 266 HM eyes. The ChT in the test dataset was measured both manually and automatically at 9 different regions within the choroid, and the average difference in ChT between the 2 methods was calculated. The intraclass correlation coefficient (ICC) was calculated to evaluate the agreement between the 2 measurements.

Results
Our method reached an IoU, DSC, sensitivity, and specificity of 87.89%, 93.40%, 92.42%, and 99.82% in HM, respectively. The average difference in ChT between the 2 measurements was 5.54±4.57 µm. The ICC was above 0.90 (P<0.001) for all regions of the choroid, indicating a very high level of agreement.

Conclusions
The GCS-Net proposed in our study provides a reliable and fast tool to quantify ChT in HM patients and could potentially be used as a tool for monitoring ChT in ocular diseases related to the choroid.
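The four evaluation metrics named in the Methods (IoU, DSC, sensitivity, and specificity) can all be derived from the pixel-wise confusion counts between a predicted binary mask and the ground-truth mask. The sketch below is a minimal NumPy illustration of those standard definitions; the function name and toy masks are ours, not from the paper, and the actual evaluation pipeline used for the GCS-Net is not described here.

```python
import numpy as np

def segmentation_metrics(pred, gt):
    """Compute IoU, Dice (DSC), sensitivity, and specificity
    for a pair of binary segmentation masks of equal shape."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()        # true positives
    fp = np.logical_and(pred, ~gt).sum()       # false positives
    fn = np.logical_and(~pred, gt).sum()       # false negatives
    tn = np.logical_and(~pred, ~gt).sum()      # true negatives
    iou = tp / (tp + fp + fn)                  # intersection over union
    dsc = 2 * tp / (2 * tp + fp + fn)          # Dice similarity coefficient
    sensitivity = tp / (tp + fn)               # true-positive rate
    specificity = tn / (tn + fp)               # true-negative rate
    return iou, dsc, sensitivity, specificity

# Toy 4x4 masks for illustration only (not study data).
pred = np.array([[1, 1, 0, 0],
                 [1, 1, 0, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
gt = np.array([[1, 1, 0, 0],
               [1, 0, 0, 0],
               [0, 0, 0, 0],
               [0, 0, 0, 0]])
iou, dsc, sens, spec = segmentation_metrics(pred, gt)
```

Note that DSC and IoU are monotonically related (DSC = 2·IoU / (1 + IoU)), which is consistent with the paper reporting a DSC (93.40%) higher than the IoU (87.89%).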