Abstract

Unsupervised domain adaptation aims to leverage knowledge from a labeled source domain to learn an accurate model in an unlabeled target domain. However, many previous approaches propose to learn domain-agnostic feature representations using a global distribution alignment objective, which does not consider the fine-grained cluster structures in the source and target domains. As such, the goal of this paper is to address two challenging problems: <italic xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">1) how to thoroughly explore fine-grained cluster structure knowledge in the source and target domains, and 2) how to effectively incorporate this structural knowledge for adaptation.</italic> Regarding the first point, we are motivated by the structural domain similarity assumption and propose structural representation learning, which is achieved by enforcing structural consistency between the source and target domains while retaining their individual discriminative properties. Regarding the second point, we first devise a novel structural centroid-based label prediction method, which explicitly models structural representations to form discriminative source and target cluster centroids, and estimates the label distribution of each target sample through the cosine similarity between its corresponding target cluster centroid and all the source cluster centroids. Then, we adopt clustering learning to incorporate this discriminative structural knowledge for adaptation by minimizing the KL divergence between the predictive target label distribution and an introduced auxiliary one. Comprehensive experiments and analyses on four benchmark datasets demonstrate the superiority of the proposed discriminative clustering framework.
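The label-prediction and clustering steps described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: it assumes the label distribution is a softmax over cosine similarities to the source centroids, and uses a DEC-style sharpened auxiliary distribution as a stand-in for the one introduced in the paper.

```python
import numpy as np

def predict_label_distribution(target_centroid, source_centroids, temperature=1.0):
    # Cosine similarity between one target cluster centroid and every
    # source cluster centroid, turned into a label distribution via softmax.
    # (Softmax and the temperature parameter are illustrative assumptions.)
    t = target_centroid / np.linalg.norm(target_centroid)
    s = source_centroids / np.linalg.norm(source_centroids, axis=1, keepdims=True)
    logits = (s @ t) / temperature
    exp = np.exp(logits - logits.max())          # numerically stable softmax
    return exp / exp.sum()

def auxiliary_distribution(q):
    # DEC-style auxiliary target: square the soft assignments and normalize
    # by per-cluster frequency, which sharpens confident predictions.
    # q has shape (n_samples, n_clusters); rows are label distributions.
    w = q ** 2 / q.sum(axis=0)
    return (w.T / w.sum(axis=1)).T

def kl_divergence(p, q, eps=1e-12):
    # Sum of row-wise KL(p || q); this is the clustering objective
    # minimized between the auxiliary and predictive distributions.
    return np.sum(p * np.log((p + eps) / (q + eps)))
```

In training, the KL term would be minimized over the network parameters so that target samples are pulled toward discriminative cluster centroids.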
