Abstract

Pseudo-labeling is a well-studied approach in semi-supervised learning. However, unreliable or incorrect pseudo-labels can accumulate training errors across iterative self-training steps, leading to unstable performance. Existing remedies either discard unreliable pseudo-labels, losing valuable data, or attempt to refine them, at the risk of degrading pseudo-labels that were already correct at some pixels. In this paper, we propose a novel pseudo-labeling method for semi-supervised segmentation of medical images. Unlike existing approaches, our method neither discards any data nor alters reliable pseudo-labels. It generates uncertainty masks for the predictions, uses reliable pixels as ground truth without modification, and refines the unreliable pixels instead of discarding them. Furthermore, we introduce a novel loss function that combines these two parts by multiplying each term by its corresponding uncertainty mask, so that both reliable and unreliable pixels contribute. Reliable pixels are handled with a masked cross-entropy term, while unreliable pixels are refined through a deep-learning-based adaptation of active contours. The entire process is optimized within a single loss function, without solving traditional active contour equations. We evaluated our approach on three publicly available datasets comprising MRI and CT images of cardiac structures and lung tissue, where it outperforms state-of-the-art semi-supervised learning methods on all three. Implementation of our work is available at https://github.com/behnam-rahmati/Semi-supervised-medical.
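The sketch below illustrates the general shape of such a combined loss: a cross-entropy term restricted to reliable pixels plus an active-contour-style term restricted to unreliable pixels. It is a minimal illustration, not the authors' implementation; the PyTorch setting, the binary-segmentation assumption, the Chan-Vese-style surrogate, and the function name `combined_pseudo_label_loss` are all assumptions made for this example.

```python
import torch
import torch.nn.functional as F


def combined_pseudo_label_loss(logits, image, pseudo_labels, reliable_mask,
                               lambda_ac=1.0, eps=1e-6):
    """Illustrative combined loss (hypothetical, not the paper's exact formulation).

    logits:        (B, C, H, W) network outputs.
    image:         (B, 1, H, W) input image (grayscale assumed).
    pseudo_labels: (B, H, W) integer pseudo-labels from the previous self-training round.
    reliable_mask: (B, H, W) binary mask, 1 where the pseudo-label is deemed reliable.
    """
    reliable = reliable_mask.float()
    unreliable = 1.0 - reliable

    # Reliable pixels: masked cross-entropy on the unmodified pseudo-labels.
    ce = F.cross_entropy(logits, pseudo_labels, reduction="none")        # (B, H, W)
    ce_loss = (ce * reliable).sum() / (reliable.sum() + eps)

    # Unreliable pixels: Chan-Vese-style active-contour surrogate applied to the
    # soft foreground probability (binary segmentation assumed for brevity).
    fg = torch.softmax(logits, dim=1)[:, 1]                              # (B, H, W)
    img = image[:, 0]

    # Length term: total variation of the soft prediction, restricted to unreliable pixels.
    dx = (fg[:, :, 1:] - fg[:, :, :-1]).abs()
    dy = (fg[:, 1:, :] - fg[:, :-1, :]).abs()
    length = (dx * unreliable[:, :, 1:]).sum() + (dy * unreliable[:, 1:, :]).sum()

    # Region terms: intensity homogeneity inside and outside the predicted region.
    c_in = (img * fg).sum() / (fg.sum() + eps)
    c_out = (img * (1.0 - fg)).sum() / ((1.0 - fg).sum() + eps)
    region = (fg * (img - c_in) ** 2 + (1.0 - fg) * (img - c_out) ** 2) * unreliable

    ac_loss = (length + region.sum()) / (unreliable.sum() + eps)
    return ce_loss + lambda_ac * ac_loss
```

In this sketch both terms are computed in one forward pass and summed, so the network is trained end to end from a single loss, consistent with the abstract's claim that no traditional active contour equations need to be solved.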
