Abstract
EUS is a high-skill technique that requires numerous procedures to achieve competence, yet training facilities are limited worldwide. Convolutional neural network (CNN) models have previously been implemented for object detection. We developed 2 EUS-based CNN models for recognizing normal anatomic structures during real-time linear- and radial-array EUS evaluations.

The study was performed from February 2020 to June 2022. Consecutive linear- and radial-array EUS videos of patients were recorded. Expert endosonographers identified and labeled 20 normal anatomic structures within the videos for training and validation of the CNN models. Initial CNN models (CNNv1) were developed from 45 videos and the improved models (CNNv2) from an additional 102 videos. CNN model performance was compared with that of 2 expert endosonographers.

CNNv1 was trained on 45,034 linear-array and 21,063 radial-array EUS frames; CNNv2 was trained on 148,980 linear-array and 128,871 radial-array EUS frames. Linear-array and radial-array CNNv1 achieved a mean average precision (mAP) of 75.65% and 71.36%, with total losses of .19 and .18, respectively. Linear-array CNNv2 obtained an 88.7% mAP with a .06 total loss, whereas radial-array CNNv2 achieved an 83.5% mAP with a .07 total loss. During clinical validation, CNNv2 accurately detected all studied normal anatomic structures with >98% observed agreement.

The proposed CNN models accurately recognize normal anatomic structures in prerecorded videos and during real-time EUS. Prospective trials are needed to evaluate the impact of these models on the learning curves of EUS trainees.
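The headline metric above is mean average precision (mAP), the standard object-detection score: average precision (AP) is computed per class from the ranked detections, then averaged over the 20 labeled anatomic structures. The authors' evaluation code is not published, so the following is only a minimal sketch of how per-class AP at an IoU threshold of 0.5 is commonly computed for frame-level detections; all function and variable names are hypothetical.

```python
# Sketch of per-class average precision (AP) at IoU >= 0.5.
# Names and data layout are illustrative, not taken from the study.

def iou(box_a, box_b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

def average_precision(detections, ground_truths, iou_thresh=0.5):
    """AP for one anatomic structure.

    detections: list of (frame_id, confidence, box) predicted for this class.
    ground_truths: dict mapping frame_id -> list of annotated boxes.
    """
    n_gt = sum(len(boxes) for boxes in ground_truths.values())
    if n_gt == 0:
        return 0.0
    # Each annotated box may be matched by at most one detection.
    matched = {f: [False] * len(boxes) for f, boxes in ground_truths.items()}
    # Rank detections by descending confidence before scoring.
    detections = sorted(detections, key=lambda d: -d[1])
    tp, fp = [], []
    for frame_id, _score, box in detections:
        gt_boxes = ground_truths.get(frame_id, [])
        best_iou, best_j = 0.0, -1
        for j, gt in enumerate(gt_boxes):
            overlap = iou(box, gt)
            if overlap > best_iou:
                best_iou, best_j = overlap, j
        if best_iou >= iou_thresh and not matched[frame_id][best_j]:
            matched[frame_id][best_j] = True
            tp.append(1); fp.append(0)
        else:
            tp.append(0); fp.append(1)
    # Walk down the ranked list, accumulating the precision-recall curve.
    ap, cum_tp, cum_fp, prev_recall = 0.0, 0, 0, 0.0
    for t, f in zip(tp, fp):
        cum_tp += t; cum_fp += f
        recall = cum_tp / n_gt
        precision = cum_tp / (cum_tp + cum_fp)
        ap += precision * (recall - prev_recall)  # area under the PR curve
        prev_recall = recall
    return ap
```

Under this scheme, averaging the per-class AP values over the 20 labeled structures would yield the reported mAP figure for each model.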