Abstract
In this paper we solve support vector machines in reproducing kernel Banach spaces (RKBSs) rather than in the traditional setting of reproducing kernel Hilbert spaces (RKHSs). Using the orthogonality of semi-inner-products of RKBSs, we obtain finite-dimensional representations of the dual (normalized-duality-mapping) elements of support vector machine solutions. In addition, we use Fourier transform techniques to introduce the concept of reproduction in a generalized native space, so that it becomes a reproducing kernel Banach space that can even be embedded into Sobolev spaces. Moreover, its reproducing kernel is associated with a positive definite function. The optimal solutions of support vector machines (regularized empirical risks) in these reproducing kernel Banach spaces are represented explicitly and finite-dimensionally in terms of the positive definite functions, and their finitely many parameters can be computed by fixed point iteration. We also give typical examples of reproducing kernel Banach spaces induced by Matérn functions (Sobolev splines), for which the support vector machine solutions can even be computed efficiently. Moreover, each of their reproducing bases includes information from multiple training data points. These kernel-based algorithms provide a fresh numerical tool for support vector classifiers.
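To make the ingredients of the abstract concrete, the following is a minimal illustrative sketch, not the paper's algorithm: it fits a kernel classifier by fixed point iteration (here, the fixed point map of gradient descent on a regularized squared-hinge empirical risk) using a Matérn kernel with smoothness 3/2, one of the Sobolev-spline kernels mentioned above. All names, the toy data, and the choice of loss are assumptions made for illustration.

```python
import numpy as np

def matern32(r, rho=1.0):
    """Matern kernel with smoothness nu = 3/2 (a Sobolev spline)."""
    a = np.sqrt(3.0) * np.abs(r) / rho
    return (1.0 + a) * np.exp(-a)

def svm_fixed_point(X, y, lam=0.1, eta=0.05, tol=1e-8, max_iter=5000):
    """Illustrative fixed point iteration c <- c - eta * grad F(c) for the
    regularized empirical risk
        F(c) = (1/n) * sum_i max(0, 1 - y_i f(x_i))^2 + lam * c^T K c,
    where f(x_i) = (K c)_i and K is the Matern-3/2 Gram matrix.
    This is a generic smoothed-SVM sketch, not the paper's RKBS scheme."""
    K = matern32(X[:, None] - X[None, :])
    n = len(y)
    c = np.zeros(n)
    for _ in range(max_iter):
        f = K @ c                      # decision values at training points
        margin = 1.0 - y * f
        active = margin > 0            # points violating the margin
        grad = -(2.0 / n) * K @ (y * margin * active) + 2.0 * lam * K @ c
        c_new = c - eta * grad         # one step of the fixed point map
        if np.linalg.norm(c_new - c) < tol:
            return c_new
        c = c_new
    return c

# toy one-dimensional two-class problem (assumed data for illustration)
X = np.array([-2.0, -1.5, -1.0, 1.0, 1.5, 2.0])
y = np.array([-1.0, -1.0, -1.0, 1.0, 1.0, 1.0])
c = svm_fixed_point(X, y)
pred = np.sign(matern32(X[:, None] - X[None, :]) @ c)
```

For a small step size the map is a contraction on this convex objective, so the iterates converge to the minimizer; the paper's contribution is to obtain such finite-dimensional parameterizations and iterations in the Banach-space setting.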