Abstract
Unsupervised domain adaptation (UDA) aims to alleviate domain shift by transferring relevant domain information from a fully labeled source domain to an unlabeled target domain. Although significant progress has been made recently, the use of UDA in real-world applications is still limited by low-resource hardware and privacy concerns. To further extend the flexibility of the target model, this study addresses a new UDA setting in which only a source model is provided, and target models with various network architectures can be learned according to the deployment environment. To this end, we propose a novel two-stage Cross-domain Knowledge Distillation method via Teacher–Student Mutual Learning, termed CdKD-TSML, which enables peer networks to employ pseudo labels assigned by one another as supplemental supervision. In the first stage, to suppress the inevitable noise in hard pseudo labels generated by the self-supervised clustering procedure, we softly refine the pseudo labels with mutual learning, where networks and label predictions are optimized online and cooperatively by distilling knowledge from each other. In the second stage, we jointly fine-tune the distilled models via a cooperative consistency learning strategy, which selects pseudo-labeled samples from the two peers to update the networks, thereby enhancing the generalization of the models. We theoretically demonstrate that the target error of CdKD-TSML is minimized by simultaneously decreasing the pseudo-label noise and alleviating the sample selection bias. Experiments on three challenging UDA datasets demonstrate that CdKD-TSML yields superior results compared with other state-of-the-art methods, proving its effectiveness in this novel setting.
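The two key mechanisms described above can be sketched as follows. This is a minimal, illustrative NumPy sketch, not the paper's implementation: `alpha` (the soft-label mixing weight) and `tau` (the confidence threshold) are hypothetical parameters introduced here for illustration, and the refinement and selection rules are plausible instantiations of mutual soft-label refinement (stage one) and cooperative consistency-based sample selection (stage two).

```python
import numpy as np

def refine_soft_labels(p_a, p_b, alpha=0.5):
    """Stage-one sketch: each peer's training target mixes its own hard
    pseudo label with the other peer's soft prediction (mutual learning).
    p_a, p_b: (N, C) softmax outputs of the two peer networks.
    alpha: hypothetical mixing weight (not from the paper)."""
    num_classes = p_a.shape[1]
    hard_a = np.eye(num_classes)[p_a.argmax(axis=1)]  # one-hot pseudo labels of peer A
    hard_b = np.eye(num_classes)[p_b.argmax(axis=1)]
    target_a = (1 - alpha) * hard_a + alpha * p_b     # refined soft target for peer A
    target_b = (1 - alpha) * hard_b + alpha * p_a     # refined soft target for peer B
    return target_a, target_b

def select_consistent(p_a, p_b, tau=0.8):
    """Stage-two sketch: cooperative selection keeps only target samples on
    which both peers are confident and predict the same class, reducing
    pseudo-label noise and sample selection bias.
    tau: hypothetical confidence threshold."""
    confident = (p_a.max(axis=1) >= tau) & (p_b.max(axis=1) >= tau)
    agree = p_a.argmax(axis=1) == p_b.argmax(axis=1)
    return np.where(confident & agree)[0]             # indices of retained samples
```

Because each peer distills from the other's soft prediction rather than from its own hard labels alone, label noise that only one network makes tends to be averaged out, which is the intuition behind the theoretical error bound stated in the abstract.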