Abstract
Multi-task learning (MTL) is a joint learning paradigm that improves the generalization performance of related tasks. At present, most MTL methods rest on the hypothesis that all learning tasks are related and therefore suitable for joint learning. However, this hypothesis may not hold in some scenarios, which can lead to negative transfer. In this paper, we aim to address the negative transfer problem while simultaneously improving generalization performance in joint learning. Combining subspace learning with a binary group constraint, we propose a calibrated multi-task subspace learning method (CMTSL). With a low-rank constraint on the subspaces and a binary group indicator, our model can identify "with whom" each task should share while performing multi-task inference in the high-dimensional parameter space. To better approximate the low-rank constraint, we introduce a capped rank function as a tight relaxation. Finally, we propose an iterative re-weighted algorithm to solve the model and prove its convergence in theory. Experimental results on benchmark datasets demonstrate the superiority of our model.
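To illustrate why a capped rank function is a tighter relaxation of the rank than the usual nuclear-norm surrogate, the sketch below uses a common capped formulation, sum_i min(sigma_i, eps) / eps over the singular values. This is a generic illustration under assumed notation, not the paper's exact objective; the threshold `eps` and the function `capped_rank` are hypothetical names introduced here.

```python
import numpy as np

def capped_rank(W, eps=1.0):
    # Capped-rank surrogate: sum_i min(sigma_i, eps) / eps (illustrative form).
    # Singular values at or above eps each contribute exactly 1, as in the true
    # rank; smaller ones contribute proportionally, so the surrogate is much
    # tighter than the nuclear norm when large singular values are present.
    sigma = np.linalg.svd(W, compute_uv=False)
    return float(np.sum(np.minimum(sigma, eps)) / eps)

# A matrix with one dominant and one tiny singular value.
W = np.diag([5.0, 0.01, 0.0])
print(np.linalg.matrix_rank(W))                       # true rank: 2
print(float(np.linalg.svd(W, compute_uv=False).sum()))  # nuclear norm: 5.01
print(capped_rank(W, eps=1.0))                        # capped surrogate: 1.01
```

The nuclear norm (5.01) is dominated by the large singular value, while the capped surrogate (1.01) stays close to the count of significant directions, which is why capped relaxations approximate the rank constraint more tightly.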