Abstract
Background and Aims: The integrity of image acquisition is critical for biliopancreatic endoscopic ultrasonography (EUS) reporting, significantly affecting the quality of EUS examinations and disease-related decision-making. However, the quality of EUS reports varies among endoscopists. To address this, we developed a deep learning-based EUS automatic image reporting system (EUS-AIRS), aiming to achieve automatic photodocumentation in real time during EUS, including capturing standard stations, lesions, and puncture procedures.

Methods: Eight deep learning models, trained and tested on 235,784 images, were integrated to construct the EUS-AIRS. We evaluated the performance of EUS-AIRS through man-machine comparison at two levels: a retrospective test (including internal and external tests) and a prospective test. From May 2023 to October 2023, 114 patients undergoing EUS at Renmin Hospital of Wuhan University were consecutively recruited for the prospective test. The primary outcome was the completeness of the EUS-AIRS in capturing standard stations.

Results: In completeness of capturing biliopancreatic standard stations, EUS-AIRS exceeded endoscopists at all levels of expertise in both the retrospective internal test (90.8% [95% CI 88.7%-92.9%] vs. 70.5% [95% CI 67.2%-73.8%], p<0.001) and the external test (91.4% [95% CI 88.4%-94.4%] vs. 68.2% [95% CI 63.3%-73.2%], p<0.001). In the prospective test, EUS-AIRS demonstrated high accuracy and completeness in capturing standard station images, with completeness significantly outperforming manual endoscopist reports (91.4% [95% CI 89.4%-93.4%] vs. 78.1% [95% CI 75.1%-81.0%], p<0.001).

Conclusions: EUS-AIRS exhibits exceptional capability in capturing high-quality, high-integrity biliopancreatic EUS images in real time, showcasing the potential of applying an artificial intelligence image reporting system in the EUS field.
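The 95% confidence intervals above are consistent with a standard normal-approximation (Wald) interval for a proportion. The abstract does not state the interval method or the denominators, so the sketch below uses an assumed method and a hypothetical sample size (n = 727 station images, chosen only so the arithmetic reproduces the first reported interval) to illustrate how such intervals are computed.

```python
import math

def wald_ci(p_hat: float, n: int, z: float = 1.96):
    """Normal-approximation (Wald) 95% CI for a proportion.

    p_hat: observed proportion (e.g. completeness rate)
    n:     number of observations (hypothetical here; not given in the abstract)
    z:     critical value, 1.96 for a two-sided 95% interval
    """
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# Illustrative only: 90.8% completeness over an assumed n = 727
lo, hi = wald_ci(0.908, 727)
print(f"{lo:.3f} - {hi:.3f}")  # prints: 0.887 - 0.929
```

With this assumed n, the interval matches the reported 88.7%-92.9%; the actual study may have used a different method (e.g. Wilson or Clopper-Pearson) and denominator.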