Abstract
The support vector machine (SVM) is an increasingly important tool in machine learning. Despite its popularity, the SVM classifier can be adversely affected by noise in the training dataset. The SVM fits into the regularization framework of Loss + Penalty, where the loss function plays the essential role of keeping the resulting model faithful to the data. Most SVMs use convex losses, which often suffer from the negative influence of points far away from their own classes. This paper proposes a new nonconvex, differentiable loss, the huberized truncated pinball loss, which can reduce the effect of noise in the training sample. We propose the SVM classifier with the huberized truncated pinball loss (HTPSVM), which combines the elastic net penalty with the nonconvex huberized truncated pinball loss and thus inherits the benefits of both the ℓ1 and ℓ2 norm regularizers. Since the HTPSVM involves nonconvex minimization, the accelerated proximal gradient (APG) algorithm is used to solve the corresponding optimization problem. To evaluate the performance of classifiers, classification accuracy and the area under the ROC curve (AUC) are employed as accuracy indicators. The numerical results show that the new classifier is effective, and Friedman and Nemenyi post hoc tests of the experimental results indicate that the proposed HTPSVM is more robust to noise than HSVM, PSVM, and HHSVM.
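To make the idea of a huberized truncated pinball loss concrete, the following is a minimal illustrative sketch, not the paper's exact definition: the parameters `tau`, `s`, and `delta` are assumptions. It smooths the pinball loss max(u, -τu) on the margin residual u = 1 - y·f(x) at its kink via a Moreau-envelope (quadratic) piece of width [-τδ, δ], then truncates at level s so that points far from their own class contribute a bounded, hence noise-robust, amount.

```python
import numpy as np

def huberized_truncated_pinball(u, tau=0.5, s=2.0, delta=0.1):
    """Illustrative huberized truncated pinball loss (assumed form).

    u     : margin residual, u = 1 - y * f(x)
    tau   : pinball slope on the negative side (u < 0)
    s     : truncation level bounding the loss (source of nonconvexity)
    delta : width of the quadratic smoothing region around the kink
    """
    u = np.asarray(u, dtype=float)
    # Moreau-envelope smoothing of the pinball loss max(u, -tau*u):
    # linear far from the kink, quadratic on [-tau*delta, delta].
    smooth = np.where(
        u >= delta,
        u - delta / 2.0,                          # right linear branch
        np.where(
            u <= -tau * delta,
            -tau * u - tau**2 * delta / 2.0,      # left linear branch
            u**2 / (2.0 * delta),                 # quadratic near the kink
        ),
    )
    # Truncation bounds the influence of far-away (possibly noisy) points.
    return np.minimum(smooth, s)
```

Because the truncated loss is bounded, a point with a very large residual (for example, a mislabeled point deep inside the wrong class) contributes at most `s` to the objective, whereas a convex loss would let it dominate the fit.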