Abstract

In this paper we solve support vector machines in reproducing kernel Banach spaces (RKBSs) rather than in the traditional reproducing kernel Hilbert spaces (RKHSs). Using the orthogonality induced by the semi-inner products of RKBSs, we obtain finite-dimensional representations of the dual (normalized-duality-mapping) elements of support vector machine solutions. In addition, Fourier transform techniques allow us to introduce a notion of reproduction in a generalized native space, making it a reproducing kernel Banach space that can even be embedded into Sobolev spaces; its reproducing kernel is associated with a positive definite function. The optimal solutions of support vector machines (regularized empirical risks) in these reproducing kernel Banach spaces are represented explicitly and finite-dimensionally in terms of the positive definite functions, and the finitely many parameters involved can be computed by fixed-point iteration. We also give typical examples of reproducing kernel Banach spaces induced by Matérn functions (Sobolev splines) whose support vector machine solutions can be computed efficiently. Moreover, each of their reproducing bases incorporates information from multiple training data points. These kernel-based algorithms provide a new numerical tool for support vector classifiers.
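As a hedged illustration only (not the paper's RKBS algorithm): in the familiar RKHS special case, the coefficients of a kernel-based regularized empirical risk minimizer solve a linear system with the Matérn Gram matrix, and even this linear system can be handled by a simple fixed-point (Richardson) iteration. The RKBS setting described in the abstract replaces the linear system with a nonlinear fixed-point equation in the dual elements; the kernel, step size, and squared-loss objective below are illustrative assumptions.

```python
import numpy as np

def matern32(x, y, ell=1.0):
    """Matern 3/2 (Sobolev spline) kernel on 1-D inputs."""
    r = np.abs(x[:, None] - y[None, :])
    s = np.sqrt(3.0) * r / ell
    return (1.0 + s) * np.exp(-s)

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 20)                      # training sites
y_obs = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(20)

K = matern32(x, x)
lam = 0.1                                          # regularization weight
A = K + lam * np.eye(20)                           # SPD system matrix

# Fixed-point (Richardson) iteration for (K + lam*I) c = y:
# c <- c + eta * (y - A c), convergent since 0 < eta < 2/||A||.
eta = 1.0 / np.linalg.norm(A, 2)
c = np.zeros(20)
for _ in range(2000):
    c = c + eta * (y_obs - A @ c)

# The fitted function is f(t) = sum_k c_k * K(t, x_k).
resid = np.linalg.norm(A @ c - y_obs)
```

The iteration converges geometrically because the iteration matrix `I - eta*A` has spectral radius below one for a symmetric positive definite `A`; the RKBS fixed-point map plays an analogous role for the nonlinear dual equation.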
