Abstract

Compared to the standard support vector machine, the generalized eigenvalue proximal support vector machine (GEPSVM) copes well with the "XOR" problem. However, it is based on the squared Frobenius norm and hence is sensitive to outliers and noise. To improve robustness, this paper introduces the capped L1-norm into the generalized eigenvalue proximal support vector machine, which employs the non-squared L1-norm together with a "capping" operation, and further proposes a novel capped L1-norm proximal support vector machine, called CPSVM. Owing to the capped L1-norm, CPSVM can effectively remove extreme outliers and suppress the effect of noisy data. CPSVM can also be viewed as a weighted generalized eigenvalue proximal support vector machine and is solved through a series of generalized eigenvalue problems. Experimental results on an artificial dataset, several UCI datasets, and an image dataset demonstrate the effectiveness of CPSVM.
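The abstract states that the method is solved through a series of generalized eigenvalue problems, as in the underlying GEPSVM formulation. The sketch below illustrates one such subproblem under assumptions not spelled out in the abstract: each proximal plane w·x − b = 0 is obtained by minimizing a Rayleigh quotient of the form ||Aw − eb||² / ||Bw − eb||², with an assumed Tikhonov regularizer `delta` added for numerical stability. The function name `gepsvm_plane` and the regularization scheme are illustrative, not the authors' exact algorithm.

```python
import numpy as np

def gepsvm_plane(A, B, delta=1e-4):
    """Fit a proximal plane w.x - b = 0 close to the rows of A and far
    from the rows of B, via a generalized eigenvalue problem (a sketch
    of the GEPSVM-style subproblem; `delta` is an assumed regularizer)."""
    m1, n = A.shape
    m2, _ = B.shape
    # Augment each point with a bias column so z = [w; b] is one vector.
    Aa = np.hstack([A, -np.ones((m1, 1))])
    Bb = np.hstack([B, -np.ones((m2, 1))])
    # Numerator and denominator matrices of the Rayleigh quotient,
    # both regularized so the denominator matrix is invertible.
    G = Aa.T @ Aa + delta * np.eye(n + 1)
    H = Bb.T @ Bb + delta * np.eye(n + 1)
    # Generalized eigenproblem G z = lambda H z, solved as eig(H^{-1} G);
    # the eigenvector of the smallest eigenvalue minimizes the quotient.
    vals, vecs = np.linalg.eig(np.linalg.solve(H, G))
    z = np.real(vecs[:, np.argmin(np.real(vals))])
    return z[:n], z[n]  # w, b
```

On XOR-style data (two classes lying along the two diagonals), each fitted plane hugs its own class, which is the behavior the abstract credits to the proximal-plane formulation; the capped L1-norm variant proposed in the paper additionally reweights points to limit the influence of outliers.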
