Abstract

With the boom in machine learning, the support vector machine (SVM) is widely employed in pattern recognition. However, most SVM models concentrate on single-task learning, while multi-task learning has been largely neglected. Compared with single-task learning, multi-task learning can achieve good performance on each task by mining the information shared among tasks. In addition, the loss function plays an important role in the accuracy of an SVM. Inspired by multi-task learning and the SVM with pinball loss (pin-SVM), we propose two novel multi-task support vector machines with pinball loss for binary classification, named MTL-pin-SVM I and MTL-pin-SVM II. Both methods maximize the quantile distance for each task, which makes them less sensitive to noise and more stable under re-sampling. Moreover, MTL-pin-SVM II can use different combinations of kernel functions for different tasks, so it can outperform other multi-task models when suitable kernel combinations are chosen for each task. Both models include the multi-task SVMs with hinge loss, denoted MTL-C-SVM I and MTL-C-SVM II, as special cases. Extensive experiments on multi-task datasets validate the effectiveness of the proposed models.
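For context, the pinball loss and the single-task pin-SVM objective that the proposed models build on can be written as follows; this is the standard formulation from the pin-SVM literature, not the authors' multi-task extension, and the symbols \tau, C, and \phi are our notational assumptions rather than the paper's:

  L_{\tau}(u) =
  \begin{cases}
    u,       & u \ge 0,\\
    -\tau u, & u < 0,
  \end{cases}
  \qquad 0 \le \tau \le 1,

  \min_{w,\,b}\ \tfrac{1}{2}\lVert w \rVert^{2}
    + C \sum_{i=1}^{m} L_{\tau}\!\bigl(1 - y_{i}\bigl(w^{\top}\phi(x_{i}) + b\bigr)\bigr).

Setting \tau = 0 reduces the pinball loss to the hinge loss, which is consistent with the statement that the hinge-loss multi-task SVMs arise as special cases of the proposed models.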
