Abstract

In this article, we study a tensor-based multitask learning (MTL) method for classification. In many real-world applications, the available training samples are limited and can be naturally arranged into multidimensional arrays (tensors); this motivates the use of MTL, where shared structural information among related tasks can be leveraged to improve generalization performance. We propose a regularized tensor-based MTL method for joint feature selection and classification. For feature selection, we employ the Fisher discriminant criterion both to select discriminative features and to control within-class nonstationarity. For classification, we take both shared and task-specific structural information into consideration: we decompose the regression tensor of each task into a linear combination of a shared tensor and a task-specific tensor and propose a composite tensor norm, using the scaled latent trace norm to regularize the shared tensor and the l1-norm for the task-specific tensor. We further derive a computationally efficient optimization algorithm based on the alternating direction method of multipliers (ADMM) to tackle the joint learning of discriminative features and multitask classifiers. Experimental results on real electroencephalography (EEG) datasets demonstrate the superiority of our method over state-of-the-art techniques.
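The decomposition and composite norm described above can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the scaled latent trace norm is approximated here by the minimum scaled nuclear norm over single-mode unfoldings (a standard upper bound obtained by placing the whole tensor in one latent component), and the weights `alpha` and `beta` are hypothetical regularization parameters introduced only for illustration.

```python
import numpy as np

def unfold(tensor, mode):
    # Mode-k matricization: bring axis `mode` to the front and flatten the rest.
    return np.moveaxis(tensor, mode, 0).reshape(tensor.shape[mode], -1)

def composite_penalty(shared, task_specific, alpha=1.0, beta=1.0):
    """Illustrative composite tensor norm (assumed form, not the paper's exact one):
    an upper-bound surrogate of the scaled latent trace norm on the shared tensor
    (minimum over modes of nuclear norm of the unfolding, scaled by 1/sqrt(n_k)),
    plus an l1 penalty on the task-specific tensor."""
    scaled = [np.linalg.norm(unfold(shared, k), 'nuc') / np.sqrt(shared.shape[k])
              for k in range(shared.ndim)]
    return alpha * min(scaled) + beta * np.abs(task_specific).sum()

# Each task's regression tensor combines a shared part and a task-specific part.
rng = np.random.default_rng(0)
shared = rng.standard_normal((4, 5, 6))
task_specific = 0.1 * rng.standard_normal((4, 5, 6))
W_task = shared + task_specific          # regression tensor for one task
penalty = composite_penalty(shared, task_specific)
```

In a full ADMM solver, this penalty would be combined with the classification loss and minimized by alternating updates over the shared and task-specific tensors.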
