Abstract

In this paper, we propose a stochastic gradient descent algorithm, called the stochastic gradient descent method-based generalized pinball support vector machine (SG-GPSVM), to solve data classification problems. This approach is developed by replacing the hinge loss function in the conventional support vector machine (SVM) with a generalized pinball loss function. We show that SG-GPSVM is convergent and that it approximates the conventional generalized pinball support vector machine (GPSVM). Further, the symmetric kernel method is adopted to evaluate the performance of SG-GPSVM as a nonlinear classifier. According to the experimental results, our proposed algorithm surpasses existing methods in terms of noise insensitivity, resampling stability, and accuracy on large-scale data.
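To illustrate the loss function the abstract refers to, the following is a minimal Python sketch of a generalized pinball loss; the parameter names tau1, tau2 (slopes) and eps1, eps2 (insensitive-zone widths) and the exact functional form are assumptions modeled on the (ε1, ε2)-insensitive formulation cited in the paper, not the authors' verbatim definition.

    import numpy as np

    def generalized_pinball_loss(u, tau1=1.0, tau2=0.5, eps1=0.1, eps2=0.1):
        # u is the margin residual 1 - y * f(x); parameter values are illustrative only.
        u = np.asarray(u, dtype=float)
        return np.maximum.reduce([
            tau1 * (u - eps1 / tau1),   # penalty for margin violations (u large and positive)
            np.zeros_like(u),           # flat insensitive zone: no penalty
            -tau2 * (u + eps2 / tau2),  # penalty for points far on the correct side (u negative)
        ])

With tau2 = 0 and eps1 = eps2 = 0 this sketch reduces to the familiar hinge loss, which is the replacement described above.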

Highlights

  • Support vector machine (SVM) is a popular supervised binary classification algorithm based on statistical learning theory; we extend it with the generalized pinball loss

  • Compared to the hinge loss support vector machine (SVM) and Pegasos, the major advantage of our proposed method is that the SG-generalized pinball support vector machine (GPSVM) is less sensitive to noise, especially feature noise around the decision boundary

  • We investigated the convergence of SG-GPSVM and the theoretical approximation between GPSVM and SG-GPSVM


Summary

Introduction: Using the Generalized Pinball Loss

Support vector machine (SVM) is a popular supervised binary classification algorithm based on statistical learning theory. A modified ε-insensitive zone for Pin-SVM was proposed; this method does not consider the patterns that lie in the insensitive zone while building the classifier, and its formulation requires the value of ε to be specified beforehand, so a poor choice may affect its performance. Rastogi [17] recently proposed the modified (ε1, ε2)-insensitive zone support vector machine, an extension of existing loss functions that accounts for noise sensitivity and resampling stability. To overcome the above-mentioned limitations on large-scale problems, and inspired by studies of SVM and the generalized pinball loss function, we propose a novel stochastic subgradient descent method with the generalized pinball support vector machine (SG-GPSVM).
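To make the optimization step concrete, here is a minimal Pegasos-style stochastic subgradient sketch for the linear case, minimizing (lam/2)·||w||² plus the average generalized pinball loss; the step size 1/(lam·t), the hyperparameter names, and the omission of a bias term are illustrative assumptions, not the authors' exact scheme.

    import numpy as np

    def sg_gpsvm_train(X, y, lam=0.01, T=1000, tau1=1.0, tau2=0.5, eps1=0.1, eps2=0.1, seed=0):
        # Stochastic subgradient descent on (lam/2)*||w||^2 + mean generalized pinball loss.
        rng = np.random.default_rng(seed)
        n, d = X.shape
        w = np.zeros(d)
        for t in range(1, T + 1):
            i = rng.integers(n)                 # sample one training example at random
            u = 1.0 - y[i] * (X[i] @ w)         # margin residual for that example
            eta = 1.0 / (lam * t)               # decreasing Pegasos-style step size
            if u > eps1 / tau1:                 # margin violated: pull w toward y[i]*X[i]
                g = lam * w - tau1 * y[i] * X[i]
            elif u < -eps2 / tau2:              # deep inside the correct side: small opposite pull
                g = lam * w + tau2 * y[i] * X[i]
            else:                               # insensitive zone: only the regularizer acts
                g = lam * w
            w -= eta * g
        return w

Given training data X with labels y in {-1, +1}, predictions in this sketch are sign(X @ w); the nonlinear case in the paper instead relies on the kernel method, which this linear sketch does not cover.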

Related Work and Background
Linear Case
Nonlinear Case
Convergence Analysis
Numerical Experiments
Artificial Datasets
UCI Datasets
Large-Scale Dataset
Conclusions

