Abstract

Unsupervised domain adaptation (UDA), which aims to alleviate the domain shift between a source domain and a target domain, has attracted substantial research interest. Previous studies have proposed effective UDA methods, but they require both labeled source data and unlabeled target data to achieve the desired distribution alignment. However, due to privacy concerns, a vendor can often provide the target client only with the pretrained source model rather than the source data, causing classical UDA techniques to fail. To address this issue, this paper proposes a novel Superpixel-guided Class-level Denoised self-training framework (SCD) that adapts the pretrained source model to the target domain in the absence of source data. Since the source data are unavailable, the model can only be trained on the target domain with pseudo labels produced by the pretrained source model; due to the domain shift, these predictions are noisy. To cope with this, we propose three mutually reinforcing components tailored to our self-training framework: (i) an adaptive class-aware thresholding strategy for more balanced pseudo-label generation; (ii) a masked superpixel-guided clustering method that produces multiple content- and spatially-adaptive feature centroids, enhancing the discriminability of the final prototypes for effective prototypical label denoising; and (iii) adaptive learning schemes for suspected noisy-labeled and correctly labeled pixels that make full use of the available information. Comprehensive experiments on multi-site fundus image segmentation demonstrate the superior performance of our approach and the effectiveness of each component.
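
To make the self-training pipeline more concrete, below is a minimal illustrative sketch (not the authors' implementation) of two of the steps described above: per-class adaptive thresholding for pseudo-label generation and a simplified prototype-based relabeling step. All function names, the quantile-based threshold, and the single-centroid prototypes are assumptions for illustration only; the paper's method additionally uses masked superpixel-guided clustering to obtain multiple content- and spatially-adaptive centroids per class.

```python
# Hypothetical sketch of source-free self-training pseudo-label handling.
# Assumes the feature map and the softmax output share the same spatial size.
import torch
import torch.nn.functional as F


def class_aware_pseudo_labels(probs: torch.Tensor, percentile: float = 0.5) -> torch.Tensor:
    """probs: (N, C, H, W) softmax outputs of the pretrained source model.
    Returns hard pseudo labels with an ignore index (-1) where the confidence
    falls below a per-class adaptive threshold (here, a per-class quantile of
    the predicted confidences -- an assumed choice, not the paper's exact rule)."""
    conf, labels = probs.max(dim=1)                       # (N, H, W)
    pseudo = labels.clone()
    for c in range(probs.shape[1]):
        mask = labels == c
        if mask.any():
            thr = torch.quantile(conf[mask], percentile)  # class-specific threshold
            pseudo[mask & (conf < thr)] = -1              # mark low-confidence pixels
    return pseudo


def prototype_denoise(features: torch.Tensor, probs: torch.Tensor,
                      pseudo: torch.Tensor, ignore_index: int = -1) -> torch.Tensor:
    """Relabel retained pixels by their nearest class prototype (a single
    confidence-weighted mean feature per class); a simplified stand-in for the
    paper's superpixel-guided multi-centroid prototypes."""
    n, d, h, w = features.shape
    feats = features.permute(0, 2, 3, 1).reshape(-1, d)   # (N*H*W, d)
    flat_probs = probs.permute(0, 2, 3, 1).reshape(-1, probs.shape[1])
    protos = []
    for c in range(probs.shape[1]):
        w_c = flat_probs[:, c:c + 1]                      # soft weights for class c
        protos.append((w_c * feats).sum(0) / w_c.sum().clamp_min(1e-6))
    protos = torch.stack(protos)                          # (C, d)
    dist = torch.cdist(F.normalize(feats, dim=1), F.normalize(protos, dim=1))
    relabel = dist.argmin(dim=1).reshape(n, h, w)
    denoised = pseudo.clone()
    keep = pseudo != ignore_index
    denoised[keep] = relabel[keep]                        # overwrite suspected noisy labels
    return denoised
```

In a full self-training loop, the denoised labels would supervise the target model, with separate loss treatments for pixels whose prototype-based relabeling agrees or disagrees with the original pseudo label, analogous to the adaptive learning schemes described in component (iii).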
