Abstract

Compared with Lp-norm (1 < p < +∞) Support Vector Machines (SVMs), the L1-norm SVM enjoys the attractive property of performing classification and feature selection simultaneously. Margin error bounds for SVMs on Hilbert spaces (or, more generally, on q-uniformly smooth Banach spaces) have been obtained in the literature to justify the strategy of maximizing the margin. In this paper, we estimate the margin error bound for L1-norm SVM methods and give a geometrical interpretation of the result. We show that the fat-shattering dimensions of the Banach spaces ℓ1 and ℓ∞ are both infinite. We therefore establish margin error bounds for the SVM on finite-dimensional spaces with the L1-norm, thus supplying a statistical justification for the large margin classification of the L1-norm SVM on finite-dimensional spaces. To complete the theory, corresponding results for the L∞-norm SVM are also presented.
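For reference, the soft-margin L1-norm SVM discussed above is commonly formulated as the following optimization problem (this is the standard textbook formulation; the precise variant analyzed in the paper may differ):

```latex
\min_{w \in \mathbb{R}^d,\; b \in \mathbb{R},\; \xi \in \mathbb{R}^n}
  \;\|w\|_1 + C \sum_{i=1}^{n} \xi_i
\quad \text{subject to} \quad
  y_i \bigl( \langle w, x_i \rangle + b \bigr) \ge 1 - \xi_i,
  \qquad \xi_i \ge 0, \quad i = 1, \dots, n,
```

where (x_i, y_i) are the training samples, ξ_i are slack variables, and C > 0 trades off margin and training error. Because the ℓ1 penalty ‖w‖₁ tends to drive many coordinates of w to exactly zero, features with zero weight are effectively discarded; this is the simultaneous classification and feature-selection property noted above.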
