Abstract

In multi-task learning (MTL), multiple related tasks are learned jointly so that information shared across tasks improves generalization performance. However, most MTL methods assume that all tasks are indeed related and suitable for joint learning. In real situations this assumption may not hold, which can lead to negative transfer. In this paper, we therefore focus not only on robustly learning the common feature structure shared by tasks, but also on identifying which tasks a given task should share information with. Combining this with the idea of subspace learning, we propose an elaborate multi-task subspace learning model (EMTSL) with a discrete group-structure constraint, which clusters the tasks into a set of groups. By adopting the Schatten p-norm in place of the trace norm, EMTSL better approximates the low-rank constraint and avoids the trivial solution. Furthermore, we design an efficient algorithm based on the re-weighted method to solve the proposed model, and we provide a convergence analysis of this algorithm. Experimental results on both synthetic and real-world datasets demonstrate the superiority of our method.
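To illustrate why the Schatten p-norm approximates the low-rank constraint more tightly than the trace norm, the sketch below compares the two on a small matrix. This is a minimal illustration, not the paper's model: it uses the common convention that the Schatten p-norm regularizer is the sum of singular values raised to the p-th power, which equals the trace (nuclear) norm at p = 1 and approaches rank(W) as p tends to 0.

```python
import numpy as np

def schatten_p(W, p):
    """Sum of singular values raised to the p-th power.
    p = 1 recovers the trace (nuclear) norm; as p -> 0 the
    value approaches rank(W), giving a tighter low-rank surrogate."""
    s = np.linalg.svd(W, compute_uv=False)
    return float(np.sum(s ** p))

W = np.diag([3.0, 1.0])              # rank-2 matrix with singular values 3 and 1
print(schatten_p(W, 1.0))            # 4.0 -> trace norm
print(round(schatten_p(W, 0.1), 3))  # ~2.116, close to rank(W) = 2
```

As p decreases, the regularizer depends less on the magnitudes of the singular values and more on how many of them are nonzero, which is the behavior the abstract attributes to the Schatten p-norm.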
