Abstract

The twin support vector machine (TWSVM) improves the classification performance of the support vector machine by solving two smaller quadratic programming problems. However, the method has the following defects: (1) the twin support vector machine and several of its variants are built on the hinge loss function, which is sensitive to noise and unstable under resampling; (2) the models must be converted from the original space to the dual space, so their time complexity is high. To further enhance the performance of the twin support vector machine, the pinball loss function is introduced into the twin bounded support vector machine, and the non-differentiability of the pinball loss at zero is handled by constructing a smooth approximation function. On this basis, a smooth twin bounded support vector machine model with pinball loss is obtained. The model is solved iteratively in the original space using the Newton-Armijo method, a smooth twin bounded support vector machine algorithm with pinball loss is proposed, and the convergence of the iterative algorithm is proven theoretically. In the experiments, the proposed algorithm is validated on UCI datasets and artificial datasets, and its performance is compared with that of other representative algorithms, demonstrating the effectiveness of the proposed algorithm.
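As one concrete illustration of the smoothing idea described above (a minimal sketch, not necessarily the paper's exact construction), the pinball loss max(u, -τu) can be rewritten as u + (1+τ)·max(0, -u), and the non-differentiable plus function max(0, ·) can then be replaced by the widely used smooth surrogate p(x, α) = x + (1/α)·log(1 + exp(-αx)). The function names and the smoothing parameter α below are illustrative assumptions:

```python
import math

def pinball(u, tau):
    # Pinball loss: u for u >= 0, -tau*u for u < 0; equivalently max(u, -tau*u).
    # Non-differentiable at u = 0 whenever tau > -1.
    return max(u, -tau * u)

def smooth_plus(x, alpha):
    # Smooth approximation of the plus function max(0, x):
    #   p(x, alpha) = x + (1/alpha) * log(1 + exp(-alpha*x)).
    # Infinitely differentiable; converges to max(0, x) as alpha grows.
    return x + math.log1p(math.exp(-alpha * x)) / alpha

def smooth_pinball(u, tau, alpha=10.0):
    # Uses the identity max(u, -tau*u) = u + (1 + tau) * max(0, -u),
    # then swaps max(0, .) for its smooth surrogate, removing the kink at u = 0.
    return u + (1.0 + tau) * smooth_plus(-u, alpha)
```

A differentiable surrogate like this is what makes a Newton-type iteration in the original space applicable, since the objective then has well-defined gradients everywhere.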

Highlights

  • The support vector machine (SVM) proposed by Vapnik et al. [1] is a machine learning method based on the principle of the Vapnik-Chervonenkis (VC) dimension and structural risk minimization in statistical learning

  • The twin bounded support vector machine (TBSVM) proposed by Shao et al. [10] is implemented by adding a regularization term to the objective function

  • Even when a twin support vector machine solved in the original space has been proposed, the model is still based on the hinge loss function


Summary

Introduction

The support vector machine (SVM) proposed by Vapnik et al. [1] is a machine learning method based on the principle of the Vapnik-Chervonenkis (VC) dimension and structural risk minimization in statistical learning. The twin bounded support vector machine (TBSVM) proposed by Shao et al. [10] is implemented by adding a regularization term to the objective function, which further improves the generalization performance of the TWSVM. Researchers have introduced the pinball loss into twin support vector machines and proposed different versions [20,21,22]; these models still need to solve two quadratic programming problems in the dual space. Even when a twin support vector machine solved in the original space has been proposed, the model is still based on the hinge loss function. In this work, a smooth twin bounded support vector machine algorithm with pinball loss is proposed.
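The Newton-Armijo scheme used to solve the smoothed model in the original space can be sketched generically as follows (an illustrative implementation for a smooth strictly convex objective, not the paper's exact update rule; the function names and parameters are assumptions):

```python
import numpy as np

def newton_armijo(f, grad, hess, x0, sigma=1e-4, beta=0.5, tol=1e-8, max_iter=50):
    """Newton's method with an Armijo backtracking line search (generic sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:          # stop when the gradient vanishes
            break
        d = np.linalg.solve(hess(x), -g)     # Newton direction
        t, fx, slope = 1.0, f(x), g.dot(d)   # slope < 0 for a descent direction
        while f(x + t * d) > fx + sigma * t * slope:
            t *= beta                        # backtrack until the Armijo condition holds
        x = x + t * d
    return x
```

For example, on the quadratic f(x) = ½xᵀAx - bᵀx with A positive definite, a single full Newton step already reaches the minimizer A⁻¹b; the Armijo backtracking only shortens the step when the smoothed loss is locally less well approximated by its quadratic model.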

Related works
Pinball loss and its smooth approximation function
Linear case
Nonlinear case
Convergence of the algorithm
Experimental results and analysis
UCI datasets
Friedman test
Pin-GTWSVM
Artificial datasets
NDC datasets
Toy dataset
Conclusion