Abstract

Automated skin lesion segmentation is an essential yet challenging task for computer-aided skin disease diagnosis. One major challenge for learning-based segmentation methods is the limited number of manually annotated dermoscopy images. Many semi-supervised methods have been proposed to exploit unlabeled data by self-training with pseudo labels. However, plain pseudo labels are less accurate, and the pixel-wise features of unlabeled data are often poorly formulated due to the large variations among different lesions. Aiming at producing a good segmentation embedding space in a semi-supervised manner, in this paper we propose a novel dynamic prototypical feature representation learning framework to address these problems. Specifically, we propose a novel denoised pseudo label generation method, which effectively filters out the unreliable components of plain pseudo labels and provides guidance for the subsequent feature representation learning. We then propose a memory relation learning method to enhance the intermediate feature representation globally. Additionally, we propose a prototype-based confidence-aware contrastive learning method to learn a better local feature structure during semi-supervised training, strengthening intra-class compactness and inter-class separability. Extensive experiments on two skin lesion segmentation datasets demonstrate that our method outperforms other popular semi-supervised segmentation methods.
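The abstract does not specify how the denoised pseudo labels are produced. A common realization of the idea is confidence-based filtering, where low-confidence pixels are excluded from the self-training loss; the sketch below illustrates this (the function name, threshold value, and ignore-index convention are assumptions for illustration, not the paper's exact method):

```python
import numpy as np

def denoise_pseudo_labels(probs, threshold=0.9, ignore_index=255):
    """Filter a plain pseudo label by per-pixel confidence.

    probs: (H, W, C) array of per-pixel class probabilities
           predicted by the segmentation model on an unlabeled image.
    Pixels whose maximum class probability falls below `threshold`
    are marked with `ignore_index` so the training loss skips them.
    """
    pseudo = probs.argmax(axis=-1)          # plain pseudo label
    confidence = probs.max(axis=-1)         # per-pixel confidence
    pseudo[confidence < threshold] = ignore_index
    return pseudo

# Toy example: a 2x2 "image" with 2 classes (lesion / background).
probs = np.array([[[0.95, 0.05], [0.60, 0.40]],
                  [[0.10, 0.90], [0.55, 0.45]]])
labels = denoise_pseudo_labels(probs)
# Confident pixels keep their predicted class; uncertain pixels
# are replaced with the ignore index and excluded from the loss.
```

Only the two confident pixels (probabilities 0.95 and 0.90) would contribute to self-training here; the borderline predictions are discarded rather than propagated as noisy supervision.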
