Abstract

Domain adaptation aims to adapt knowledge learned from one domain (the source domain) to another (the target domain). Existing approaches typically require a portion of task-relevant target-domain data a priori. We propose zero-shot deep domain adaptation (ZDDA), which uses paired dual-domain task-irrelevant data to eliminate the need for task-relevant target-domain training data. ZDDA learns to generate common representations for source- and target-domain data. The representation from either domain can then be used to train a system that works on both domains, or to remove the dependence on either domain in sensor fusion settings. We develop two variants of ZDDA: ZDDA for the classification task (ZDDA-C) and ZDDA for the metric learning task (ZDDA-ML). Another limitation of existing approaches is that most are designed for the closed-set classification task, i.e., the sets of classes in both the source and target domains are known. In contrast, ZDDA-C is also applicable to the open-set classification task, where not all classes are known during training. Moreover, the effectiveness of ZDDA-ML shows that ZDDA's applicability is not limited to classification tasks. We evaluate ZDDA-C and ZDDA-ML on classification and metric learning tasks, respectively. Under most experimental conditions, ZDDA outperforms the baselines without using task-relevant target-domain training data.
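
The abstract describes learning common representations for the two domains from paired task-irrelevant data. The following is a minimal sketch of that idea, not the authors' implementation: two encoders are aligned on paired samples with an L2 loss so that either domain can later feed the same task network. The encoder architecture, feature dimension, loss choice, and the decision to freeze the target-domain encoder are illustrative assumptions, not details taken from the paper.

```python
# Illustrative sketch (assumptions noted above): align a source-domain encoder
# with a target-domain encoder on paired task-irrelevant data.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """Small CNN mapping a 1-channel 32x32 image to a feature vector."""
    def __init__(self, feat_dim=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, feat_dim),
        )

    def forward(self, x):
        return self.net(x)

source_enc = Encoder()  # source modality (e.g., one sensor)
target_enc = Encoder()  # target modality (e.g., another sensor)

# Paired task-irrelevant samples: the same content captured in both domains.
# Random tensors stand in for real data here.
src_batch = torch.randn(16, 1, 32, 32)
tgt_batch = torch.randn(16, 1, 32, 32)

optimizer = torch.optim.Adam(source_enc.parameters(), lr=1e-3)

# Hold the target encoder fixed (an assumption for this sketch) and train the
# source encoder to produce matching representations, so a downstream task
# network trained on one domain's features also works on the other's.
for step in range(100):
    with torch.no_grad():
        tgt_feat = target_enc(tgt_batch)
    src_feat = source_enc(src_batch)
    loss = nn.functional.mse_loss(src_feat, tgt_feat)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Once the representations are aligned, a classifier (for ZDDA-C) or a metric-learning head (for ZDDA-ML) can be trained on top of them using source-domain task-relevant data only, which is the sense in which no task-relevant target-domain training data is needed.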
