Abstract

Kernels offer an effective way to implicitly embed the original data into a higher- or infinite-dimensional space in support vector machines. Kernel learning, which attempts to determine optimal kernel functions for evaluating relationships between data, has garnered increasing interest, and employing multiple kernels to enhance optimality and generalization is a promising direction. In this study, we focused on the parameter optimization problem of Hadamard kernel functions, a newly proposed class of kernels in machine learning. Motivated by the multiple kernel learning framework for optimizing kernel combinations and the intriguing properties of the L4-norm, we proposed a high-order L4Lp (p ⩾ 3) norm-product regularized multiple kernel learning framework to optimize discrimination performance, in which the hinge, log, and square loss functions are detailed. We demonstrated that Hadamard multiple kernel learning can attain optimal performance while implicitly avoiding the difficulty of parameter specification by optimizing a linear combination of Hadamard kernel functions over different kernel parameters. The effectiveness of the proposed approach was verified through experiments on several benchmark datasets. In addition, the high-order L4Lp (p ⩾ 3) norm-product regularized multiple kernel learning framework can be used to optimize radial basis function kernels under different kernel parameters.
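The core idea described above, combining base kernels over several candidate parameters so that no single parameter must be chosen in advance, can be sketched as follows. This is a minimal illustration using RBF kernels (which the abstract notes the framework also covers), not the paper's Hadamard kernel or its L4Lp norm-product regularizer; the weights here are fixed by hand, whereas the proposed framework would learn them.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    """Gram matrix of the RBF kernel k(x, y) = exp(-gamma * ||x - y||^2)."""
    sq_dists = (np.sum(X**2, axis=1)[:, None]
                + np.sum(Y**2, axis=1)[None, :]
                - 2.0 * X @ Y.T)
    return np.exp(-gamma * np.maximum(sq_dists, 0.0))

def combined_kernel(X, Y, gammas, weights):
    """Conic combination of base kernels over several parameter values,
    the building block of a multiple kernel learning objective.
    With nonnegative weights, the result is again a valid kernel."""
    K = np.zeros((X.shape[0], Y.shape[0]))
    for gamma, w in zip(gammas, weights):
        K += w * rbf_kernel(X, Y, gamma)
    return K
```

For example, `combined_kernel(X, X, [0.1, 1.0, 10.0], [0.5, 0.3, 0.2])` yields a single Gram matrix spanning three bandwidths; plugging such a combined kernel into an SVM is what lets the learner trade off the candidate parameters instead of requiring one to be fixed beforehand.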
