Abstract

In this work, we propose a novel multi-task learning framework with better generalization ability and robustness than conventional multi-task methods. In the proposed approach, downstream task-related information is first extracted from the original input via a stochastic global representation. Task-specific representations are then generated for learning the downstream tasks. Moreover, disentanglement between the task-specific representations is encouraged through mutual information minimization, yielding better representations. When computing the downstream losses, we account for the homoscedastic task-related uncertainties to balance the individual task losses. Finally, we evaluate our method on a publicly available dataset. The experimental results suggest that the proposed approach outperforms existing methods in terms of generalization ability and robustness.
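To make the loss-balancing step concrete, below is a minimal PyTorch sketch of one common form of homoscedastic uncertainty weighting (in the style of the Kendall et al. loss, L = Σ_i exp(−s_i)·L_i + s_i with s_i = log σ_i²). The class name, its interface, and the exact weighting formula are assumptions for illustration, not the paper's own implementation, which may differ in form.

```python
import torch
import torch.nn as nn


class UncertaintyWeightedLoss(nn.Module):
    """Combine per-task losses using learned homoscedastic uncertainties.

    Hypothetical sketch: implements L = sum_i exp(-s_i) * L_i + s_i,
    where s_i = log(sigma_i^2) is a learnable per-task parameter.
    """

    def __init__(self, num_tasks: int):
        super().__init__()
        # Learn log(sigma_i^2) per task; initializing at zero gives
        # equal weighting (sigma_i = 1) at the start of training.
        self.log_vars = nn.Parameter(torch.zeros(num_tasks))

    def forward(self, task_losses):
        total = torch.zeros((), dtype=self.log_vars.dtype)
        for i, loss in enumerate(task_losses):
            precision = torch.exp(-self.log_vars[i])  # 1 / sigma_i^2
            # Precision down-weights noisy tasks; the +log_vars term
            # penalizes inflating the uncertainty to ignore a task.
            total = total + precision * loss + self.log_vars[i]
        return total


# Hypothetical usage: two task losses from a shared backbone.
criterion = UncertaintyWeightedLoss(num_tasks=2)
loss = criterion([torch.tensor(0.8), torch.tensor(1.3)])
```

Because the uncertainty parameters are trained jointly with the network, tasks whose losses are inherently noisier receive smaller effective weights, which is the balancing behavior the abstract refers to.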
