Abstract

Support Vector Machine (SVM) is a supervised machine learning algorithm used for robust and accurate classification. Despite its advantages, its classification speed deteriorates with the large number of support vectors that arise in large-scale problems, and its performance depends strongly on the choice of kernel parameters. This paper presents a kernel parameter optimization algorithm for SVM based on the Sliding Mode Control (SMC) algorithm in a closed-loop manner. The proposed method defines an error equation and a sliding surface and iteratively updates the Radial Basis Function (RBF) kernel parameter or the 2-degree polynomial kernel parameters, forcing the SVM training error to converge below a threshold value. Due to the closed-loop nature of the proposed algorithm, key properties such as robustness to uncertainty and fast convergence are obtained. To assess the performance of the proposed technique, ten standard benchmark databases covering a range of applications were used, and the proposed method and state-of-the-art techniques were applied to classify the data. Experimental results show that the proposed method is significantly faster and more accurate than the baseline SVM technique and some of the most recent methods. These gains are due to the closed-loop nature of the proposed algorithm, which significantly reduces the method's dependency on the data.
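For reference, the two kernel families whose parameters the method tunes can be written in their standard forms; the paper's exact parameterization is not reproduced here, so the symbols below are illustrative:

    K_{\mathrm{RBF}}(x, z) = \exp\!\left(-\gamma \lVert x - z \rVert^{2}\right), \qquad
    K_{\mathrm{poly}}(x, z) = \left(a\, x^{\top} z + c\right)^{2}

Here $\gamma$ is the RBF kernel parameter, and $a$, $c$ are the parameters of the 2-degree polynomial kernel.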

Highlights

  • Support Vector Machine (SVM) is one of the most widely used machine learning classification algorithms, alongside classifiers such as nearest neighbor [1], boosted decision trees [2], regularized logistic regression [3], neural networks [4], and random forests [5].

  • This paper presents a kernel parameter optimization algorithm for SVM based on the Sliding Mode Control (SMC) algorithm in a closed-loop manner.

  • The proposed method defines an error equation and a sliding surface and iteratively updates the Radial Basis Function (RBF) kernel parameter or the 2-degree polynomial kernel parameters, forcing the SVM training error to converge below a threshold value.


Summary

INTRODUCTION

Support Vector Machine (SVM) is one of the most widely used machine learning classification algorithms, alongside classifiers such as nearest neighbor [1], boosted decision trees [2], regularized logistic regression [3], neural networks [4], and random forests [5]. Because the performance and speed of the algorithm depend on the kernel function and its parameters, various methods have been proposed to find an optimal kernel for SVM and to reduce its number of support vectors. The proposed method first defines an error equation and a sliding surface, and then seeks good tracking and low training error by updating the parameter(s) of the kernel; a minimal sketch of this loop is given after this paragraph. This procedure is repeated until the validation accuracy has been decreasing for a specified number of iterations, which serves as an early-stopping criterion. The proposed method generated more accurate results compared with some of the latest techniques. These results, together with the method's high robustness against the uncertainties present in data from different sources, are due to the closed-loop nature of the SMC algorithm used in conjunction with the SVM method.
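As a rough illustration only, the following Python sketch shows one way such a closed loop could look. It assumes a simple sliding surface s = e, with e the gap between the training error and a target threshold, a multiplicative sign(s) update for the RBF parameter gamma, and scikit-learn's SVC as the trainer; the paper's actual error equation, sliding surface, and update gain are not reproduced here.

    # Hypothetical closed-loop sketch: drive the SVM training error below a
    # threshold by updating the RBF kernel parameter with a sign(s) correction.
    # The error equation, sliding surface, and gain below are illustrative
    # stand-ins, not the paper's definitions.
    import numpy as np
    from sklearn.svm import SVC
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split

    X, y = make_classification(n_samples=600, n_features=20, random_state=0)
    X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

    gamma = 1.0        # initial RBF kernel parameter (assumed starting point)
    eta = 0.2          # update gain (assumed)
    threshold = 0.05   # target training error
    patience, bad_steps, best_val = 5, 0, 0.0

    for it in range(50):
        clf = SVC(kernel="rbf", gamma=gamma).fit(X_tr, y_tr)
        train_err = 1.0 - clf.score(X_tr, y_tr)
        val_acc = clf.score(X_val, y_val)

        e = train_err - threshold   # error equation (stand-in)
        s = e                       # sliding surface (stand-in)
        if s <= 0:                  # training error forced below the threshold
            break

        # Early stopping: quit once validation accuracy keeps decreasing.
        if val_acc > best_val:
            best_val, bad_steps = val_acc, 0
        else:
            bad_steps += 1
            if bad_steps >= patience:
                break

        # Switching update: a larger gamma lets the RBF SVM fit the training
        # set more tightly, so move gamma up while s > 0.
        gamma *= float(np.exp(eta * np.sign(s)))

    print(f"gamma={gamma:.4f}, train_err={train_err:.3f}, val_acc={val_acc:.3f}")

The multiplicative update keeps gamma positive by construction; whether the paper uses an additive or multiplicative step is not stated in this summary, so that choice is an assumption of the sketch.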

The full paper develops the approach in the following sections:

  • SUPPORT VECTOR MACHINE
  • POLYNOMIAL OPTIMAL KERNEL PARAMETER
  • Findings
  • SIMULATION AND EXPERIMENTAL RESULTS

