Abstract

The support vector machine (SVM) is widely recognized as an effective classification tool and has demonstrated superior performance in diverse applications. For large-scale pattern classification problems, however, it may require substantial memory and incur prohibitive computational costs. Motivated by this, we propose a new SVM model with a novel generalized ramp loss (LR-SVM). First-order optimality conditions for the non-convex and non-smooth LR-SVM are established via a newly introduced P-stationary point, based on which the LR support vectors and the working set of LR-SVM are defined; interestingly, under mild conditions all LR support vectors lie on the two support hyperplanes. A fast proximal alternating direction method of multipliers with a working set (LR-ADMM) is developed to solve LR-SVM, and LR-ADMM is shown to achieve global convergence while maintaining low computational complexity. Numerical comparisons with nine leading solvers show that LR-ADMM delivers outstanding performance, particularly on large-scale pattern classification problems, with fewer support vectors, higher prediction accuracy, and shorter computational time.
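The abstract does not specify the exact form of the generalized ramp loss. A common truncated-hinge (ramp) form, which caps the hinge loss at a constant so that badly misclassified points contribute only a bounded penalty (one mechanism behind the sparser support-vector sets the abstract reports), can be sketched as follows; the cap parameter `mu` and the function itself are illustrative assumptions, not the paper's definition:

```python
import numpy as np

def ramp_loss(margin, mu=1.0):
    """Illustrative truncated-hinge (ramp) loss: min(max(0, 1 - margin), mu).

    `margin` is y_i * (w . x_i + b). Points with margin >= 1 incur no loss;
    points far on the wrong side are capped at `mu`, bounding the influence
    of outliers. This is a generic ramp loss, NOT necessarily the paper's
    generalized ramp loss.
    """
    return np.minimum(np.maximum(0.0, 1.0 - np.asarray(margin, dtype=float)), mu)
```

For example, a correctly classified point with margin 2 incurs zero loss, a point on the decision boundary (margin 0) incurs loss 1, and a grossly misclassified point (margin -5) is capped at `mu` rather than penalized linearly as in the standard hinge loss.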

