Abstract

Transfer learning leverages labeled knowledge in a source domain to help build classification models in a target domain where labels are scarce or unavailable. Previous studies have shown that high-level concepts extracted from the original features are better suited to cross-domain classification, so many transfer learning methods transfer knowledge by modeling high-level concepts on the original feature space. However, these methods have two limitations. First, learning high-level concepts directly on the original feature space dilutes the share of common-feature information used when constructing the knowledge-transfer bridge. Second, learning multiple high-level concepts only on the original feature space offers no targeted way to learn the latent shared information contained in domain-specific features, so that information cannot be exploited effectively. To overcome these limitations, this paper proposes a novel method named Dual-Space Transfer Learning based on an Indirect Mutual Promotion Strategy (DSTL). DSTL is formalized as an optimization problem based on non-negative matrix tri-factorization. It first extracts the common features between domains and constructs a common feature space. It then integrates the learning of high-level concepts on the common feature space and on the original feature space through an indirect promotion strategy, so that the two feature spaces mutually reinforce the learning of common and domain-specific features. Systematic tests on benchmark data sets show the superiority of the DSTL method.
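The building block the abstract names is non-negative matrix tri-factorization, X ≈ U S Vᵀ with all three factors non-negative. For reference only, below is a minimal sketch of plain tri-factorization with standard multiplicative updates; the function name `nmtf`, the ranks `k1`/`k2`, and the simple Frobenius-norm objective are illustrative assumptions and do not reproduce the DSTL objective or its dual-space coupling.

```python
import numpy as np

def nmtf(X, k1, k2, n_iter=200, eps=1e-9, seed=0):
    """Illustrative sketch (not the DSTL objective): factorize a
    non-negative matrix X (m x n) as X ~ U @ S @ V.T, with
    U (m x k1), S (k1 x k2), V (n x k2) all non-negative, by
    minimizing ||X - U S V^T||_F^2 with multiplicative updates."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    U = rng.random((m, k1))
    S = rng.random((k1, k2))
    V = rng.random((n, k2))
    for _ in range(n_iter):
        # U <- U * (X V S^T) / (U S V^T V S^T)
        U *= (X @ V @ S.T) / (U @ S @ V.T @ V @ S.T + eps)
        # S <- S * (U^T X V) / (U^T U S V^T V)
        S *= (U.T @ X @ V) / (U.T @ U @ S @ V.T @ V + eps)
        # V <- V * (X^T U S) / (V S^T U^T U S)
        V *= (X.T @ U @ S) / (V @ S.T @ U.T @ U @ S + eps)
    return U, S, V

# Toy usage: a small non-negative document-term-style matrix.
X = np.random.default_rng(1).random((20, 12))
U, S, V = nmtf(X, k1=4, k2=3)
print(np.linalg.norm(X - U @ S @ V.T))  # reconstruction error after the updates
```

In the transfer-learning reading of this factorization, the row factor plays the role of high-level concepts over features and the column factor clusters documents; DSTL, per the abstract, couples two such factorizations (one on the common feature space, one on the original feature space) rather than solving a single one as above.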
