Abstract

Endoscopic ultrasonography (EUS) is commonly used for preoperative T staging of esophageal cancer, but it entails additional cost and patient discomfort and carries a risk of adverse events. Moreover, the accuracy of EUS depends heavily on operator training and experience, the technique is not universally available, and inter-observer variability is high. It is therefore desirable to explore an alternative way to determine preoperative T stage in esophageal cancer. Whether conventional endoscopy can predict EUS T stage has not previously been investigated. In the current study, with the assistance of artificial intelligence, we developed a deep learning model to predict EUS T stage based on 9,714 images collected from 3,333 patients. A ResNet-152 pre-trained on the ImageNet dataset was trained, with appropriate transfer learning and fine-tuning strategies, on the conventional endoscopic images and their corresponding labels (T1, T2, T3, T4, and Normal). Augmentation strategies, including rotation and flipping, were applied to increase the number of images and improve prediction accuracy. In total, 4,382 T1, 243 T2, 3,985 T3, 1,102 T4, and 14,302 control images were obtained and split into training, validation, and independent testing datasets in a ratio of 4:1:1. Our model achieved satisfactory performance, with areas under the receiver-operating characteristic curve (AUC) of 0.9767, 0.9637, 0.9597, and 0.9442 for T1, T2, T3, and T4, respectively, on the independent testing dataset. In conclusion, conventional gastroscopy combined with artificial intelligence has great potential to predict EUS T stage.
