Abstract

Domain adaptation (DA) aims to generalize a learning model across training and testing data despite a mismatch in their distributions. In light of a theoretical estimate of the upper error bound, we argue in this article that an effective DA method for classification should: 1) search for a shared feature subspace where source and target data are not only aligned in terms of distributions, as most state-of-the-art DA methods do, but also discriminative, in that instances of different classes are well separated; and 2) account for the geometric structure of the underlying data manifold when inferring labels on the target domain. In comparison with a baseline DA method that only aligns data distributions between source and target, we derive three DA models for classification, namely close yet discriminative DA (CDDA), geometry-aware DA (GA-DA), and discriminative and geometry-aware DA (DGA-DA), to highlight the contribution of 1) in CDDA, of 2) in GA-DA, and of 1) and 2) jointly in DGA-DA. Using both synthetic and real data, we show the effectiveness of the proposed approach, which consistently outperforms state-of-the-art DA methods over 49 image classification DA tasks across eight popular benchmarks. We further carry out an in-depth analysis quantifying the contribution of each term of our DA model and provide insights into the proposed methods by visualizing both real and synthetic data.
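The two ingredients above can be illustrated with a compact sketch. The code below is not the authors' DGA-DA formulation; it uses a linear-kernel maximum mean discrepancy (MMD) term as a generic stand-in for distribution alignment and a standard kNN-graph label-propagation step as a stand-in for geometry-aware label inference on the target domain. All function names and parameters (linear_mmd, label_propagation, k, alpha) are hypothetical and assumed here for illustration only.

```python
import numpy as np

def linear_mmd(Xs, Xt):
    """Squared MMD with a linear kernel: distance between domain means.

    Xs: (n_s, d) source features, Xt: (n_t, d) target features.
    A smaller value indicates better marginal distribution alignment.
    """
    return np.sum((Xs.mean(axis=0) - Xt.mean(axis=0)) ** 2)

def label_propagation(X, y_labeled, n_labeled, k=5, alpha=0.99, n_iter=50):
    """Graph-based label propagation over a kNN similarity graph.

    X: (n, d) stacked source + target features; the first n_labeled rows
    carry known (source) labels y_labeled, the rest are unlabeled target
    samples. Returns predicted labels for the unlabeled rows.
    """
    n = X.shape[0]
    n_classes = int(y_labeled.max()) + 1

    # Pairwise squared Euclidean distances and a Gaussian affinity matrix.
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    sigma2 = np.median(d2) + 1e-12
    W = np.exp(-d2 / sigma2)
    np.fill_diagonal(W, 0.0)

    # Keep only the k strongest affinities per row, then symmetrize.
    idx = np.argsort(-W, axis=1)[:, :k]
    mask = np.zeros_like(W, dtype=bool)
    mask[np.repeat(np.arange(n), k), idx.ravel()] = True
    W = np.where(mask | mask.T, W, 0.0)

    # Normalized smoothing operator S = D^{-1/2} W D^{-1/2}.
    d = W.sum(axis=1) + 1e-12
    S = W / np.sqrt(np.outer(d, d))

    # One-hot initial label matrix: source rows fixed, target rows zero.
    Y = np.zeros((n, n_classes))
    Y[np.arange(n_labeled), y_labeled] = 1.0

    # Iterative propagation: F <- alpha * S * F + (1 - alpha) * Y.
    F = Y.copy()
    for _ in range(n_iter):
        F = alpha * S @ F + (1 - alpha) * Y

    return F[n_labeled:].argmax(axis=1)
```

In a DA pipeline of this kind, the alignment term would be minimized while learning the shared subspace, and the propagation step would then infer target labels along the manifold structure encoded by the graph, which is the role the geometry-aware term plays in the paper.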
