Abstract

Pancreatic cancer presents no specific symptoms, which makes early-stage diagnosis difficult with established image-based screening methods; as a consequence, it has the worst prognosis among all cancers. Although endoscopic ultrasonography (EUS) plays a key role in diagnostic algorithms for pancreatic diseases, B-mode imaging of the pancreas can be affected by confounders such as chronic pancreatitis, making both pancreatic lesion segmentation and classification laborious and highly specialized. To address these challenges, this work proposes a semi-supervised multi-task network (SSM-Net) that leverages both unlabeled and labeled EUS images for joint pancreatic lesion classification and segmentation. Specifically, we first devise a saliency-aware representation learning module (SRLM) that pre-trains a feature extraction encoder on a large number of unlabeled images by computing a contrastive loss with a semantic saliency map, which is obtained by our spectral residual module (SRM). For labeled EUS images, we then devise channel attention blocks (CABs) to refine the features extracted from the encoder pre-trained on unlabeled images for lesion segmentation, and devise a merged global attention module (MGAM) together with a feature similarity loss (FSL) to obtain lesion classification results. We collect a large-scale EUS-based pancreas image dataset (LS-EUSPI) consisting of 9,555 pathologically proven labeled EUS images (499 patients across four categories) and 15,500 unlabeled EUS images. Experimental results on the LS-EUSPI dataset and a public thyroid gland lesion dataset show that SSM-Net clearly outperforms state-of-the-art methods.
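
The abstract does not detail how the spectral residual module (SRM) derives its semantic saliency map, but the name points to the classic spectral-residual saliency method of Hou and Zhang (CVPR 2007). The sketch below is a minimal NumPy/SciPy implementation of that method under this assumption; it is illustrative rather than the paper's actual SRM, and the function name and filter parameters are arbitrary choices.

```python
# Minimal spectral-residual saliency sketch (after Hou & Zhang, CVPR 2007),
# assumed here as a plausible basis for the paper's SRM; the abstract does
# not specify the module's internals.
import numpy as np
from scipy.ndimage import gaussian_filter, uniform_filter

def spectral_residual_saliency(image, avg_size=3, sigma=2.5):
    """Return a saliency map in [0, 1] for a 2-D grayscale image."""
    spectrum = np.fft.fft2(image)
    log_amplitude = np.log(np.abs(spectrum) + 1e-8)
    phase = np.angle(spectrum)
    # Spectral residual: log-amplitude minus its locally averaged version.
    residual = log_amplitude - uniform_filter(log_amplitude, size=avg_size)
    # Back to the spatial domain with the original phase; square and smooth.
    saliency = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    saliency = gaussian_filter(saliency, sigma=sigma)
    # Normalize to [0, 1] for use as a soft spatial weighting.
    return (saliency - saliency.min()) / (saliency.max() - saliency.min() + 1e-8)

# Usage on a synthetic grayscale frame (an EUS image would be loaded instead).
frame = np.random.rand(256, 256)
saliency_map = spectral_residual_saliency(frame)
```

In the SRLM, such a map would presumably highlight lesion-like regions so that the contrastive loss can emphasize them during encoder pre-training; exactly how the saliency map enters the loss is not specified in the abstract.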
