Abstract

Background
Detailed evaluation of the bile duct (BD) is a main focus during endoscopic ultrasound (EUS). The aim of this study was to develop a system to augment EUS scanning of the BD.

Methods
The scanning procedure was divided into four stations. We developed a station classification model and a BD segmentation model with 10,681 images and 2,529 images, respectively. For internal validation, 1,704 images were applied to classification and 667 images to segmentation. For video validation of classification and segmentation, 264 and 517 video clips were used, respectively. For a man-machine contest, an independent data set containing 120 images was applied. 799 images from two other hospitals were used for external validation. A crossover study was conducted to evaluate the system's effect on reducing the difficulty of ultrasound image interpretation.

Findings
For classification, the model achieved an accuracy of 93.3% on the image set and 90.1% on the video set. For segmentation, the model had a Dice score of 0.77 on the image set, and a sensitivity of 89.48% and specificity of 82.3% on the video set. In external validation, the classification model achieved 82.6% accuracy. In the man-machine contest, the models achieved 88.3% accuracy in station classification and a Dice score of 0.72 in BD segmentation, comparable to expert performance. In the crossover study, trainees' accuracy improved from 60.8% to 76.3% (P < 0.01, 95% CI 20.9–27.2).

Interpretation
We developed a deep learning-based augmentation system for EUS BD scanning.

Funding
Hubei Provincial Clinical Research Center for Digestive Disease Minimally Invasive Incision, Hubei Province Major Science and Technology Innovation Project, National Natural Science Foundation of China.
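The segmentation results above are reported as Dice scores. As background only (the paper does not describe its implementation), a minimal sketch of how the Dice coefficient is typically computed for binary segmentation masks, using NumPy; the function name and toy masks are illustrative:

```python
import numpy as np

def dice_score(pred: np.ndarray, target: np.ndarray) -> float:
    """Dice coefficient between two binary masks: 2|A ∩ B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    denom = pred.sum() + target.sum()
    # Convention: two empty masks are a perfect match.
    return 2.0 * intersection / denom if denom else 1.0

# Toy 4x4 masks: predicted region overlaps the target in 3 of 4 pixels.
pred = np.array([[1, 1, 0, 0],
                 [1, 1, 0, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 0]])
target = np.array([[1, 1, 0, 0],
                   [1, 0, 0, 0],
                   [0, 0, 0, 0],
                   [0, 0, 0, 0]])
print(round(dice_score(pred, target), 2))  # → 0.86 (i.e., 2*3 / (4+3))
```

A Dice score of 0.77, as reported for the image set, therefore indicates that roughly three quarters of the predicted and reference BD pixels coincide.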
