Abstract

In many applications of kernel methods, such as the support vector machine (SVM), performance depends greatly on the choice of kernel, and it is not always clear which kernel is most suitable for a given task. Multiple kernel learning (MKL) allows one to optimize over linear combinations of kernels. A two-step approach, which alternately optimizes the standard SVM and the kernel weights, is proposed to solve the non-convex MKL problem. The generalized convexity of MKL is studied, which guarantees the strong duality of the corresponding optimization problem. A further contribution is a computationally efficient inexact-projection-based method for optimizing the standard SVM. In addition, a nonmonotone gradient method is proposed to optimize the kernel weights, with the Hessian matrix approximated by a variant of the limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) approach. The global convergence of the proposed two-step approach is analyzed. The utility of the proposed scheme is demonstrated empirically on several datasets, and its performance is compared with state-of-the-art methods in terms of accuracy and scalability. The results show that the proposed method matches existing methods in accuracy while taking much less time to converge to a stationary point.
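
To make the alternating structure concrete, the sketch below illustrates a generic two-step MKL scheme of the kind the abstract describes: fix the kernel weights and solve a standard SVM on the combined kernel, then take a gradient step on the weights over the probability simplex. This is only a minimal illustration, not the paper's algorithm; it uses scikit-learn's off-the-shelf SVM solver in place of the proposed inexact-projection method, and a plain projected-gradient update in place of the nonmonotone L-BFGS step. All function names (`two_step_mkl`, `project_simplex`, `rbf_kernel`) and parameter choices are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

def rbf_kernel(X, gamma):
    """Precomputed RBF (Gaussian) kernel matrix on the rows of X."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def project_simplex(v):
    """Euclidean projection of v onto the probability simplex."""
    u = np.sort(v)[::-1]
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css - 1.0)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1)
    return np.maximum(v - theta, 0.0)

def two_step_mkl(kernels, y, C=1.0, steps=50, lr=0.1):
    """Illustrative two-step MKL: alternate (1) a standard SVM solve
    on the weighted kernel sum and (2) a projected-gradient update
    of the kernel weights d (here, plain gradient descent stands in
    for the paper's nonmonotone L-BFGS step)."""
    M = len(kernels)
    d = np.full(M, 1.0 / M)  # uniform initial weights on the simplex
    for _ in range(steps):
        # Step 1: solve the standard SVM with the combined kernel.
        K = sum(w * Km for w, Km in zip(d, kernels))
        svm = SVC(C=C, kernel="precomputed").fit(K, y)
        sv = svm.support_
        ay = svm.dual_coef_.ravel()  # alpha_i * y_i on support vectors
        # Step 2: gradient of the optimal-value function J(d) w.r.t.
        # each weight is dJ/dd_m = -0.5 * (a y)^T K_m[sv, sv] (a y).
        grad = np.array(
            [-0.5 * ay @ Km[np.ix_(sv, sv)] @ ay for Km in kernels]
        )
        d = project_simplex(d - lr * grad)
    return d, svm

# Usage on toy data with three RBF kernels at different bandwidths.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = np.where(X[:, 0] + 0.5 * rng.normal(size=100) > 0, 1, -1)
kernels = [rbf_kernel(X, g) for g in (0.1, 1.0, 10.0)]
d, svm = two_step_mkl(kernels, y)
print("learned kernel weights:", d)
```

The gradient formula used in step 2 is the standard one for the SVM dual objective viewed as a function of the kernel weights; the paper's contribution lies in how each step is solved (inexact projections for the SVM, a nonmonotone L-BFGS-type update for the weights), which this sketch deliberately simplifies.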
