Abstract

Multitask learning (MTL) aims to learn multiple related tasks simultaneously, rather than separately, to improve the generalization performance of each task. Most existing MTL methods assume that the tasks to be learned share the same feature representation. However, this assumption may not hold for many real-world applications. In this paper, we study the problem of MTL in which each task has its own heterogeneous feature representation. To address this problem, we first construct an integrated graph from a set of bipartite graphs to build connections among the different tasks. We then propose a non-negative matrix factorization-based multitask method (MTNMF) to learn a common semantic feature space underlying the heterogeneous feature spaces of the tasks. Moreover, we propose an improved version of MTNMF (IMTNMF), which does not require constructing the correlation matrix between input features and class labels, thereby avoiding information loss. Finally, based on the common semantic features and the original heterogeneous features, we model the heterogeneous MTL problem as a multitask multiview learning (MTMVL) problem, so that a number of existing MTMVL methods can be applied to solve it effectively. Extensive experiments on three real-world problems demonstrate the effectiveness of the proposed methods, and the improved version IMTNMF achieves about a 2% average accuracy improvement over MTNMF.
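To make the core idea concrete, the sketch below illustrates one generic way to obtain a shared non-negative semantic factor across tasks whose feature spaces have different dimensionalities. It is not the paper's MTNMF algorithm: the joint objective (each task's feature-class correlation matrix M_t of size d_t x c factorized as U_t V with a shared V), the function name `joint_nmf`, and all parameter choices are assumptions made for illustration only.

```python
# Minimal sketch: jointly factorize per-task non-negative matrices with a shared
# right factor, using standard multiplicative updates for the Frobenius objective
#   min_{U_t, V >= 0}  sum_t || M_t - U_t V ||_F^2 .
# This is an illustrative stand-in, not the MTNMF formulation from the paper.
import numpy as np


def joint_nmf(correlation_mats, k, n_iter=200, eps=1e-9, seed=0):
    """Factorize each M_t (d_t x c) as M_t ~= U_t @ V with a shared V (k x c)."""
    rng = np.random.default_rng(seed)
    c = correlation_mats[0].shape[1]
    Us = [rng.random((M.shape[0], k)) for M in correlation_mats]
    V = rng.random((k, c))

    for _ in range(n_iter):
        # Task-specific factors: U_t <- U_t * (M_t V^T) / (U_t V V^T)
        for t, M in enumerate(correlation_mats):
            Us[t] *= (M @ V.T) / (Us[t] @ V @ V.T + eps)
        # Shared semantic factor: V <- V * (sum_t U_t^T M_t) / ((sum_t U_t^T U_t) V)
        num = sum(U.T @ M for U, M in zip(Us, correlation_mats))
        den = sum(U.T @ U for U in Us) @ V + eps
        V *= num / den
    return Us, V


if __name__ == "__main__":
    # Toy usage: three tasks with heterogeneous feature dimensions (50, 80, 30)
    # but the same 4 classes; the learned V lives in the common semantic space.
    rng = np.random.default_rng(1)
    mats = [rng.random((d, 4)) for d in (50, 80, 30)]
    Us, V = joint_nmf(mats, k=10)
    print([U.shape for U in Us], V.shape)  # [(50, 10), (80, 10), (30, 10)] (10, 4)
```

Under these assumptions, the per-task factors U_t map heterogeneous features onto the shared factor V, which is the kind of common representation that can then be combined with the original features in a multitask multiview learner.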
