Abstract

The automatic segmentation of polyps in endoscopy images is crucial for the early diagnosis and treatment of colorectal cancer. Existing deep learning-based methods for polyp segmentation, however, are inadequate due to limited annotated data and class imbalance. Moreover, these methods obtain the final segmentation by simply thresholding the likelihood maps at a single fixed value (often set to 0.5). In this paper, we propose a novel ThresholdNet with a confidence-guided manifold mixup (CGMMix) data augmentation method, mainly to address the aforementioned issues in polyp segmentation. CGMMix conducts manifold mixup at the image and feature levels, and adaptively steers the decision boundary away from the under-represented polyp class under confidence guidance, alleviating the limited-training-data and class imbalance problems. Two consistency regularizations, a mixup feature map consistency (MFMC) loss and a mixup confidence map consistency (MCMC) loss, are devised to exploit consistency constraints when training on the augmented mixup data. We then propose a two-branch approach, termed ThresholdNet, that couples segmentation and threshold learning in an alternating training strategy. A threshold map supervision generator (TMSG) is embedded to provide supervision for the threshold map, thereby inducing better optimization of the threshold branch. As a consequence, ThresholdNet is able to calibrate the segmentation result with the learned threshold map. We demonstrate the effectiveness of the proposed method on two polyp segmentation datasets: it achieves state-of-the-art results, with Dice scores of 87.307% on the EndoScene dataset and 87.879% on the WCE polyp dataset. The source code is available at https://github.com/Guo-Xiaoqing/ThresholdNet.
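The core thresholding idea can be illustrated with a minimal sketch: instead of binarizing the likelihood map at a single global cut of 0.5, each pixel is compared against a spatially varying threshold map (in ThresholdNet, this map is learned by the threshold branch; here it is hand-made, and the function name is hypothetical, not from the released code):

```python
import numpy as np

def threshold_segmentation(likelihood_map, threshold_map=None):
    """Binarize a likelihood map into a segmentation mask.

    With threshold_map=None this reduces to the conventional global
    cut at 0.5; passing a per-pixel threshold map calibrates each
    pixel individually, which is the idea behind ThresholdNet.
    """
    if threshold_map is None:
        # conventional baseline: one fixed threshold everywhere
        threshold_map = np.full_like(likelihood_map, 0.5)
    return (likelihood_map > threshold_map).astype(np.uint8)

# Toy 2x2 likelihood map and a hand-made spatially varying threshold.
probs = np.array([[0.60, 0.40],
                  [0.55, 0.70]])
thr = np.array([[0.65, 0.30],
                [0.50, 0.50]])

print(threshold_segmentation(probs))       # global 0.5 cut
print(threshold_segmentation(probs, thr))  # per-pixel calibrated cut
```

Note how the per-pixel map flips both top-row decisions relative to the global 0.5 cut: a locally higher threshold suppresses a borderline positive, while a locally lower one recovers a borderline negative.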
