Abstract

Accurate abnormality classification in Wireless Capsule Endoscopy (WCE) images is crucial for early diagnosis and treatment of gastrointestinal (GI) tract cancers, yet it remains challenging due to limited annotated data, large intra-class variance, and high inter-class similarity. To address these challenges, we propose a novel semi-supervised learning method with an Adaptive Aggregated Attention (AAA) module for automatic WCE image classification. First, a novel deformation-field-based image preprocessing strategy is proposed to remove the black background and circular boundaries in WCE images. We then propose a synergic network to learn discriminative image features, consisting of two branches: an abnormal-region estimator (the first branch) and an abnormal-information distiller (the second branch). The first branch uses the proposed AAA module to capture global dependencies and incorporate contextual information to highlight the most meaningful regions, while the second branch focuses on these attended regions for accurate and robust abnormality classification. Finally, the two branches are jointly optimized by minimizing the proposed discriminative angular (DA) loss and a Jensen-Shannon divergence (JS) loss on both labeled and unlabeled data. Comprehensive experiments have been conducted on the public CAD-CAP WCE dataset. The proposed method achieves 93.17% overall accuracy in fourfold cross-validation, verifying its effectiveness for WCE image classification. The source code is available at https://github.com/Guo-Xiaoqing/SSL_WCE.
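
The abstract does not give the loss formulas, but the semi-supervised component can be illustrated with a minimal PyTorch-style sketch of a Jensen-Shannon consistency term between the two branches' predictions. The function and variable names (js_consistency_loss, total_loss, lambda_js) are hypothetical, and plain cross-entropy stands in for the paper's discriminative angular (DA) loss, whose exact form is not stated here.

```python
import torch
import torch.nn.functional as F

def js_consistency_loss(logits_a, logits_b):
    """Jensen-Shannon divergence between the two branches' predicted
    class distributions; usable on unlabeled images since no ground-truth
    label is required."""
    p = F.softmax(logits_a, dim=1)
    q = F.softmax(logits_b, dim=1)
    m = 0.5 * (p + q)
    # JS(p, q) = 0.5 * KL(p || m) + 0.5 * KL(q || m), averaged over the batch.
    kl_pm = F.kl_div(m.log(), p, reduction="batchmean")
    kl_qm = F.kl_div(m.log(), q, reduction="batchmean")
    return 0.5 * (kl_pm + kl_qm)

def total_loss(logits_l_a, logits_l_b, labels,
               logits_u_a, logits_u_b, lambda_js=1.0):
    # Hypothetical joint objective: branch A scores the full image,
    # branch B scores the attention-selected regions. Cross-entropy is
    # used here in place of the paper's DA loss.
    supervised = (F.cross_entropy(logits_l_a, labels)
                  + F.cross_entropy(logits_l_b, labels))
    consistency = js_consistency_loss(logits_u_a, logits_u_b)
    return supervised + lambda_js * consistency
```

In this sketch, the supervised term is computed only on labeled images, while the JS term encourages the two branches to agree on unlabeled images, which is one common way to exploit unlabeled data in a two-branch (synergic) setup.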
