Abstract

Domain adaptation (DA) has drawn considerable attention recently, as it facilitates learning on an unlabeled target domain by borrowing knowledge from an external labeled source domain. Most existing DA solutions seek to align feature representations between the labeled source and unlabeled target data. However, when target data are scarce, such alignment easily results in negative transfer, since the adaptation becomes dominated by the source domain. To tackle this challenging few-shot domain adaptation (FSDA) problem, in this article we propose a novel marginalized augmented FSDA (MAF) approach that simultaneously addresses the cross-domain distribution disparity and the insufficiency of target data. On the one hand, cross-domain continuity augmentation (CCA) synthesizes abundant intermediate patterns across domains, leading to a continuous domain-invariant latent space. On the other hand, source-supervised semantic augmentation (SSA) is explored to progressively diversify the conditional distribution within and across domains. Moreover, both augmentation strategies are implemented efficiently via an expected transferable cross-entropy (CE) loss over the augmented distribution rather than explicit data synthesis, and minimizing an upper bound of this expected loss introduces negligible extra computational cost. Experimentally, our method outperforms the state of the art on various FSDA benchmarks, demonstrating the effectiveness of our work. Our source code is provided at https://github.com/scottjingtt/MAF.git.
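
The abstract does not spell out the loss, but minimizing an upper bound of an expected CE loss over an augmented feature distribution is commonly realized in the style of implicit semantic data augmentation: each deep feature is treated as Gaussian-perturbed around its observed value with a class-conditional covariance, and Jensen's inequality gives a closed-form surrogate that only shifts the classifier logits. The sketch below illustrates that idea under these assumptions; the function name, the covariance estimates, and the strength parameter `lam` are illustrative placeholders, not the authors' actual implementation.

```python
import torch
import torch.nn.functional as F

def expected_ce_upper_bound(features, labels, weight, bias, covariances, lam):
    """Jensen-style upper bound of the expected CE loss when each feature
    f_i is implicitly augmented as N(f_i, lam * Sigma_{y_i}).

    features:    (N, D) deep features
    labels:      (N,)   ground-truth class indices
    weight:      (C, D) linear classifier weights
    bias:        (C,)   linear classifier biases
    covariances: (C, D, D) per-class feature covariance estimates
    lam:         augmentation strength
    """
    logits = features @ weight.t() + bias                 # (N, C) plain logits
    w_y = weight[labels]                                  # (N, D) weights of true classes
    delta_w = weight.unsqueeze(0) - w_y.unsqueeze(1)      # (N, C, D) = w_j - w_{y_i}
    sigma_y = covariances[labels]                         # (N, D, D)

    # Quadratic correction (lam/2) * (w_j - w_{y_i})^T Sigma_{y_i} (w_j - w_{y_i})
    quad = 0.5 * lam * torch.einsum('ncd,nde,nce->nc', delta_w, sigma_y, delta_w)

    # The correction vanishes for j = y_i, so the bound is a standard CE
    # loss evaluated on the shifted logits.
    return F.cross_entropy(logits + quad, labels)
```

Because the quadratic term is zero for the ground-truth class, the bound reduces to an ordinary CE loss on adjusted logits, which is consistent with the claim that the implicit augmentation adds negligible computational cost.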
