Abstract

Recently, introducing nonconvex loss functions into support vector machines (SVMs) to improve robustness against various types of noise has been drawing much attention. In this paper, we first construct a new robust capped asymmetric elastic net (CaEN) loss function. Second, we describe a novel robust Huberized kernel-based (HK) loss function and theoretically establish several important properties, such as smoothness, boundedness, and the trade-off between standard least squares and truncated least squares. Finally, we apply the CaEN loss and the HK loss to the elastic net nonparallel hyperplane SVM (ENNHSVM) to develop a fused robust geometric nonparallel SVM (FRGNHSVM). The proposed FRGNHSVM not only inherits the advantages of ENNHSVM but also improves robustness in classification problems. An efficient Pegasos-based DC (difference of convex functions) algorithm is implemented to solve the FRGNHSVM optimization problem. Compared with four well-known SVM variants, namely Lagrangian SVM, twin SVM, pinball SVM and C-loss twin SVM, experimental results on simulations and twelve UCI datasets show that the proposed FRGNHSVM often improves average prediction accuracy by more than 5%. Moreover, the accuracy advantage of FRGNHSVM becomes more pronounced as the ratio of label noise increases, indicating its superiority in dealing with label-contaminated datasets.
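
The abstract does not give the exact definitions of the CaEN and HK losses, so the display below only recalls the two standard templates such constructions build on: capping (truncating) a base loss at a level c to bound the influence of outliers, and Huberization, which blends quadratic and linear growth. The symbols L, c, delta and u here are generic placeholders, not taken from the paper.

\[
L^{\mathrm{cap}}_{c}(u) \;=\; \min\{\, L(u),\; c \,\}, \qquad
H_{\delta}(u) \;=\;
\begin{cases}
\tfrac{1}{2}u^{2}, & |u| \le \delta,\\
\delta\bigl(|u| - \tfrac{\delta}{2}\bigr), & |u| > \delta,
\end{cases}
\qquad c,\,\delta > 0.
\]

Capping keeps the loss bounded (hence less sensitive to mislabeled points), while Huberization keeps it smooth; the HK loss is stated to interpolate between standard and truncated least squares in a similar spirit.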
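
As a rough illustration of the Pegasos-based DC scheme mentioned above, the sketch below applies the same pattern, an outer DCA loop that linearizes the concave part of a nonconvex loss and Pegasos-style stochastic sub-gradient steps for each convex subproblem, to a plain linear SVM with the ramp (capped hinge) loss. It is only an assumed analogue under simplified settings: FRGNHSVM uses the CaEN/HK losses and a nonparallel-hyperplane formulation, and every name and parameter here (dca_pegasos_ramp_svm, lam, s, the iteration counts) is illustrative rather than taken from the paper.

import numpy as np

def dca_pegasos_ramp_svm(X, y, lam=0.01, s=0.0, outer_iters=10,
                         inner_iters=2000, seed=0):
    # Linear SVM with the ramp (capped hinge) loss
    #   R_s(z) = max(0, 1 - z) - max(0, s - z),  z = y_i * w^T x_i,
    # written as a difference of two convex hinge terms (DC decomposition).
    # Generic sketch only; not the paper's FRGNHSVM objective.
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(outer_iters):
        # DCA step: linearize the subtracted convex term max(0, s - z) at the
        # current w; its per-sample sub-gradient is -y_i x_i when z_i < s.
        margins = y * (X @ w)
        active = (margins < s).astype(float)
        sub_h = -(y * active)[:, None] * X          # fixed during the inner loop
        # Pegasos-style stochastic sub-gradient descent on the convex surrogate
        #   (lam/2)||w||^2 + (1/n) sum_i [max(0, 1 - y_i w^T x_i) - sub_h_i^T w]
        for t in range(1, inner_iters + 1):
            i = rng.integers(n)
            eta = 1.0 / (lam * t)
            grad = lam * w - sub_h[i]
            if y[i] * (X[i] @ w) < 1.0:
                grad -= y[i] * X[i]
            w -= eta * grad
    return w

Usage, assuming labels in {-1, +1}: w = dca_pegasos_ramp_svm(X, y); predictions are np.sign(X @ w).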
