Abstract

Proximal support vector machine (PSVM), a variant of the support vector machine (SVM), generates a pair of non-parallel hyperplanes for classification. Although PSVM is a powerful classification tool, its ability to perform feature selection remains weak. To overcome this defect, we introduce ℓ0-norm regularization into PSVM, which enables it to select important features and remove redundant features simultaneously during classification. We call the resulting model the sparse proximal support vector machine (SPSVM). Due to the presence of the ℓ0-norm, the optimization problem of SPSVM is neither convex nor smooth and is therefore difficult to solve. In this paper, we introduce a continuous nonconvex function to approximate the ℓ0-norm and propose a novel difference-of-convex-functions algorithm (DCA) to solve SPSVM. The main merit of the proposed method is that all subproblems are smooth and admit closed-form solutions. The effectiveness of the proposed method is illustrated by theoretical analysis as well as numerical experiments on both simulated and real-world datasets.
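The abstract does not specify the paper's surrogate function or subproblem form, but the general DCA pattern it describes can be sketched as follows. This is a minimal illustration, not the paper's method: it assumes a least-squares (proximal-SVM-style) loss with ±1 labels and the smooth surrogate φ(t) = t²/(t² + ε) for the ℓ0-norm. Since φ''(t) ≤ 2/ε, the objective splits into a difference of two convex functions, and linearizing the subtracted part at each iteration reduces the subproblem to ridge regression, which is smooth and has a closed-form solution, mirroring the property the abstract highlights.

```python
import numpy as np

def dca_sparse_psvm(X, y, lam=1.0, eps=0.1, iters=100, tol=1e-8):
    """Illustrative DCA for an l0-regularized proximal-SVM-style objective.

    Minimizes  F(w) = 0.5*||Xw - y||^2 + lam * sum_i phi(w_i),
    where phi(t) = t^2 / (t^2 + eps) is an assumed smooth surrogate of the
    l0-norm (the paper's surrogate may differ).

    DC decomposition F = G - H with
      G(w) = 0.5*||Xw - y||^2 + (lam/eps)*||w||^2        (convex),
      H(w) = sum_i [(lam/eps)*w_i^2 - lam*phi(w_i)]      (convex, as phi'' <= 2/eps).
    Each DCA step solves  min_w G(w) - <grad H(w_k), w>,
    i.e. a ridge-regression system with a closed-form solution.
    """
    n, d = X.shape
    w = np.zeros(d)
    A = X.T @ X + (2.0 * lam / eps) * np.eye(d)   # fixed positive-definite matrix
    Xty = X.T @ y
    for _ in range(iters):
        # gradient of the convex part H at the current iterate
        grad_H = (2.0 * lam / eps) * w - lam * 2.0 * eps * w / (w**2 + eps) ** 2
        w_new = np.linalg.solve(A, Xty + grad_H)  # closed-form subproblem solution
        if np.linalg.norm(w_new - w) < tol:
            w = w_new
            break
        w = w_new
    return w
```

Because DCA monotonically decreases the objective, the iterates converge to a stationary point; weights of redundant features are driven toward zero by the surrogate penalty, while informative features retain large weights.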
