Abstract
Benign ulcerative colorectal diseases (UCDs) such as ulcerative colitis, Crohn's disease, ischemic colitis, and intestinal tuberculosis share similar phenotypes but differ in etiology and treatment strategy. To accurately diagnose closely related diseases like UCDs, we hypothesize that contextual learning is critical for enhancing the ability of artificial intelligence models to differentiate subtle lesion differences amid vastly divergent spatial contexts. White-light colonoscopy datasets of patients with confirmed UCDs and of healthy controls were retrospectively collected. We developed a Multiclass Contextual Classification (MCC) model that differentiates among these UCDs and healthy controls by incorporating, in a unified framework, the tissue-object context surrounding each individual lesion region within a scene and spatial information from other endoscopic frames (video-level). Internal and external datasets were used to validate the model's performance. The training datasets included 762 patients, and the internal and external testing cohorts included 257 and 293 patients, respectively. Our MCC model provided a rapid reference diagnosis on the internal test sets with a high averaged area under the receiver operating characteristic curve (image-level: 0.950; video-level: 0.973) and balanced accuracy (image-level: 76.1%; video-level: 80.8%), which was superior to junior endoscopists (accuracy: 71.8%, P < .0001) and comparable to experts (accuracy: 79.7%, P = .732). On the external testing datasets, the MCC model achieved an area under the receiver operating characteristic curve of 0.988 and a balanced accuracy of 85.8%. These results suggest that the model can fit into the routine endoscopic workflow and that the contextual framework can be adopted for diagnosing other closely related diseases.
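To make the contextual idea concrete, the sketch below illustrates one plausible way to combine lesion and surrounding-context features at the image level and then pool frame embeddings at the video level. It is not the authors' implementation; the encoders, feature sizes, five-class output, and mean pooling over frames are all illustrative assumptions.

```python
# Hypothetical sketch of the contextual classification idea described in the
# abstract: fuse features from a lesion crop with features from its surrounding
# tissue context (image level), then aggregate frame embeddings across an
# endoscopic video (video level) before a multiclass head. Dimensions, class
# count, and pooling strategy are assumptions, not the published architecture.
import torch
import torch.nn as nn


def small_cnn(out_dim: int = 128) -> nn.Sequential:
    """Toy convolutional encoder standing in for a pretrained backbone."""
    return nn.Sequential(
        nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(32, out_dim), nn.ReLU(),
    )


class ContextualClassifier(nn.Module):
    """Fuses lesion and surrounding-context features, then pools over frames."""

    def __init__(self, num_classes: int = 5, feat_dim: int = 128):
        super().__init__()
        self.lesion_encoder = small_cnn(feat_dim)      # encodes the lesion region
        self.context_encoder = small_cnn(feat_dim)     # encodes surrounding tissue
        self.fuse = nn.Linear(2 * feat_dim, feat_dim)  # image-level fusion
        self.head = nn.Linear(feat_dim, num_classes)   # video-level classifier

    def forward(self, lesion_frames: torch.Tensor,
                context_frames: torch.Tensor) -> torch.Tensor:
        # Both inputs: (batch, frames, 3, H, W)
        b, t = lesion_frames.shape[:2]
        lesions = self.lesion_encoder(lesion_frames.flatten(0, 1))     # (b*t, feat)
        contexts = self.context_encoder(context_frames.flatten(0, 1))  # (b*t, feat)
        frame_feats = torch.relu(self.fuse(torch.cat([lesions, contexts], dim=1)))
        video_feat = frame_feats.view(b, t, -1).mean(dim=1)  # average over frames
        return self.head(video_feat)                          # class logits


# Usage on dummy data: 2 videos, 8 frames each, 64x64 crops.
model = ContextualClassifier()
logits = model(torch.randn(2, 8, 3, 64, 64), torch.randn(2, 8, 3, 64, 64))
print(logits.shape)  # torch.Size([2, 5])
```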