Abstract

In this paper, we study the problem of multilinear multitask learning (MLMTL), in which all tasks are stacked into a third-order tensor. In contrast to conventional multitask learning, MLMTL can better explore inherent correlations among multiple tasks by exploiting multilinear low-rank structure. Existing approaches to MLMTL are mainly based on the sum of singular values of the low-rank matrices obtained by matricizing the third-order tensor. However, these methods are suboptimal in approximating the Tucker rank. In order to elucidate intrinsic correlations among multiple tasks, we present a new approach that imposes a transformed tensor nuclear norm (TTNN) constraint in the objective function. The main advantage of the proposed approach is that it can acquire a low transformed multi-rank structure in the transformed tensor by applying suitable unitary transformations, which helps to determine the principal components that group multiple tasks and describe their intrinsic correlations more precisely. Furthermore, we establish an excess risk bound for the minimizer of the proposed TTNN approach. Experimental results on synthetic problems and real-world images show that the mean-square errors of the proposed method are lower than those of existing methods for different numbers of tasks and training samples in MLMTL.
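To make the TTNN quantity concrete, the following is a minimal sketch (not the paper's implementation) of how a transformed tensor nuclear norm can be evaluated for a third-order tensor: a unitary transform is applied along the third mode, and the matrix nuclear norms of the resulting frontal slices are summed. The function name `transformed_tensor_nuclear_norm`, the choice of the unitary DFT as the transform, and the random example data are illustrative assumptions; the paper's specific transform is not given in the abstract.

```python
import numpy as np

def transformed_tensor_nuclear_norm(X, Phi):
    """Sketch of a TTNN evaluation (illustrative, not the paper's code).

    X   : third-order tensor of shape (n1, n2, n3)
    Phi : unitary transform of shape (n3, n3) applied along the third mode
    """
    n1, n2, n3 = X.shape
    # Apply the unitary transform to every tube fiber: X_hat[i, j, :] = Phi @ X[i, j, :]
    X_hat = np.einsum('kl,ijl->ijk', Phi, X)
    # Sum the nuclear norms (sums of singular values) of the transformed frontal slices
    return sum(np.linalg.norm(X_hat[:, :, k], ord='nuc') for k in range(n3))

# Usage example with the unitary DFT matrix as the transform (assumption);
# any other unitary Phi, e.g. one learned from the tasks, could be substituted.
rng = np.random.default_rng(0)
X = rng.standard_normal((5, 4, 3))
Phi = np.fft.fft(np.eye(3)) / np.sqrt(3)  # unitary DFT matrix
print(transformed_tensor_nuclear_norm(X, Phi))
```

Under this formulation, minimizing the TTNN encourages each frontal slice of the transformed tensor to be low rank, i.e. a low transformed multi-rank structure.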
