Abstract

Multiple kernel learning (MKL) is a principled approach to kernel fusion for various learning tasks, such as classification, clustering, and dimensionality reduction. In this paper, we develop a novel multiple kernel learning model based on the Hilbert-Schmidt independence criterion (HSIC) for classification, called HSIC-MKL. In the proposed model, we first formulate an HSIC Lasso-based MKL problem, which not only has a clear statistical interpretation, namely that kernels with minimum redundancy and maximum dependence on the output labels are selected and combined, but also admits a globally optimal solution that can be computed efficiently by solving a Lasso optimization problem. Once the optimal kernel is obtained, a support vector machine (SVM) is trained to produce the prediction hypothesis, so the proposed HSIC-MKL is a two-stage kernel learning approach. Extensive experiments on real-world data sets from the UCI benchmark repository validate the superiority of the proposed model in terms of prediction accuracy.
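The abstract describes a two-stage pipeline: an HSIC Lasso problem first selects and weights base kernels against the output labels, and an SVM is then trained on the resulting combined kernel. The following is a minimal sketch of that idea, not the authors' implementation: it assumes Gaussian base kernels on a UCI data set (iris), a delta kernel on the labels, and illustrative hyperparameters (the RBF widths and the Lasso penalty), solving the non-negative Lasso by vectorizing the centered kernel matrices.

```python
# Minimal sketch of a two-stage HSIC-MKL pipeline (illustrative assumptions,
# not the paper's code): RBF base kernels, a delta label kernel, and a
# non-negative Lasso over vectorized centered kernel matrices.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import Lasso
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC


def center(K):
    """Center a kernel matrix: H K H with H = I - (1/n) 1 1^T."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H


X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Stage 1: HSIC Lasso over a family of base kernels (different RBF widths).
gammas = [0.01, 0.1, 1.0, 10.0]                              # assumed kernel family
K_tr = [rbf_kernel(X_tr, gamma=g) for g in gammas]           # base kernels on training data
L = center((y_tr[:, None] == y_tr[None, :]).astype(float))   # centered delta (label) kernel

# Vectorizing the centered kernels turns the HSIC Lasso objective into an
# ordinary non-negative Lasso: min_b 0.5*||vec(L) - sum_m b_m vec(K_m)||^2 + lam*||b||_1.
A = np.column_stack([center(K).ravel() for K in K_tr])
beta = Lasso(alpha=1e-3, positive=True, fit_intercept=False).fit(A, L.ravel()).coef_

# Stage 2: train an SVM on the learned non-negative combination of kernels.
K_comb_tr = sum(b * K for b, K in zip(beta, K_tr))
K_comb_te = sum(b * rbf_kernel(X_te, X_tr, gamma=g) for b, g in zip(beta, gammas))
svm = SVC(kernel="precomputed").fit(K_comb_tr, y_tr)
print("kernel weights:", np.round(beta, 4))
print("test accuracy:", svm.score(K_comb_te, y_te))
```

Because the Lasso is solved with a non-negativity constraint, the learned weights stay interpretable as a sparse convex-like mixture of base kernels; the sparsity reflects the abstract's point that redundant kernels are dropped while those most dependent on the labels are retained.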
