Abstract

Transfer learning aims to apply previously learned knowledge to new, unseen domains by mining the potential relationships between data from different domains. Because of their ability to handle data from different domains, projection-based transfer learning methods have attracted much attention. However, most such methods focus only on the reconstruction between domains and do not fully exploit the data. Here, we put forward a novel transfer learning method, low-rank constraint-based multiple projections learning (LRMPL), for cross-domain classification. LRMPL performs the overall reconstruction of the two domains while also aligning data that share the same label but come from different domains, by imposing low-rank constraints on the reconstruction coefficient matrix. In this way, the learned feature representation is both transferable and discriminative. Moreover, LRMPL preserves information during feature representation learning by integrating a modified PCA-like term into the objective function. To learn suitable classifier parameters and feature representations jointly, LRMPL unifies classifier learning and feature learning in a single optimization objective. Extensive experiments on several benchmark datasets show that LRMPL outperforms existing traditional transfer learning methods in cross-domain classification. Demo code is available at https://github.com/892384750/LRMPL.
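Low-rank constraints on a reconstruction coefficient matrix are commonly enforced through the nuclear norm, whose proximal operator is singular value thresholding (SVT). The sketch below illustrates this standard building block only; it is an assumption about the optimization machinery, not the authors' LRMPL implementation, and the matrix `Z` is synthetic.

```python
import numpy as np

def svt(Z, tau):
    """Singular value thresholding: the proximal operator of
    tau * ||.||_* (nuclear norm). Shrinks each singular value by tau,
    which drives the result toward low rank."""
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    s_shrunk = np.maximum(s - tau, 0.0)  # soft-threshold singular values
    return (U * s_shrunk) @ Vt

# Hypothetical reconstruction coefficient matrix for illustration.
rng = np.random.default_rng(0)
Z = rng.standard_normal((5, 4))
Z_low = svt(Z, 1.0)  # a lower-nuclear-norm (and typically lower-rank) surrogate of Z
```

In alternating-minimization schemes for low-rank models, a step of this form is typically applied to the coefficient matrix while the projections and classifier parameters are held fixed.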
