Abstract

Multitask learning (MTL) aims to learn multiple related tasks simultaneously rather than separately so as to improve the generalization performance of each task. Most existing MTL methods assume that the tasks to be learned share the same feature representation. However, this assumption may not hold in many real-world applications. In this paper, we study the problem of MTL with heterogeneous features for each task. To address this problem, we first construct an integrated graph from a set of bipartite graphs to build connections among the different tasks. We then propose a non-negative matrix factorization-based multitask method (MTNMF) to learn a common semantic feature space underlying the heterogeneous feature spaces of the individual tasks. Moreover, we propose an improved version of MTNMF (IMTNMF), which does not require constructing the correlation matrix between input features and class labels, thereby avoiding information loss. Finally, based on the common semantic features and the original heterogeneous features, we model the heterogeneous MTL problem as a multitask multiview learning (MTMVL) problem, so that a number of existing MTMVL methods can be applied to solve it effectively. Extensive experiments on three real-world problems demonstrate the effectiveness of the proposed methods; the improved version, IMTNMF, achieves about a 2% average accuracy improvement over MTNMF.
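
As a rough illustration of the shared-factor idea described above, the sketch below jointly factorizes one non-negative matrix per task (for example, a feature-label correlation matrix R_t of the kind MTNMF builds) into a task-specific basis W_t and a factor H shared across tasks, using standard multiplicative NMF updates. This is a minimal, generic sketch under those assumptions, not the authors' exact MTNMF/IMTNMF formulation; the function name shared_nmf, the choice of inputs, and the update schedule are illustrative assumptions.

```python
import numpy as np

def shared_nmf(R_list, k, n_iter=200, eps=1e-9, seed=0):
    """Jointly factorize non-negative matrices R_t (d_t x c) as R_t ~= W_t @ H,
    where H (k x c) is shared across all tasks (a common "semantic" factor)
    and each W_t (d_t x k) is task-specific. Uses Lee-Seung style
    multiplicative updates so all factors stay non-negative.
    Illustrative sketch only, not the paper's exact MTNMF objective."""
    rng = np.random.default_rng(seed)
    c = R_list[0].shape[1]                      # all tasks share the same label set
    W_list = [rng.random((R.shape[0], k)) for R in R_list]
    H = rng.random((k, c))
    for _ in range(n_iter):
        # Update each task-specific basis W_t with the shared factor H fixed.
        for W, R in zip(W_list, R_list):
            W *= (R @ H.T) / (W @ H @ H.T + eps)
        # Update the shared factor H by pooling statistics from every task.
        num = sum(W.T @ R for W, R in zip(W_list, R_list))
        den = sum(W.T @ W for W in W_list) @ H + eps
        H *= num / den
    return W_list, H

# Toy usage: three tasks with heterogeneous feature dimensions (30, 50, 20)
# but the same set of c = 4 class labels.
R_list = [np.random.rand(d, 4) for d in (30, 50, 20)]
W_list, H = shared_nmf(R_list, k=5)
print([W.shape for W in W_list], H.shape)       # [(30, 5), (50, 5), (20, 5)] (5, 4)
```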
