Abstract

The main aim of this paper is to briefly survey the most significant topics in the currently used methodologies for solving and implementing SVM-based classifiers. Following a brief introductory part, the basics of the linear and non-linear SVM models are presented in the next two sections. The problem of the soft-margin SVM is discussed in the fourth section of the paper. The currently used methods for solving the resulting QP problem require access to all labeled samples at once, and the computation of an optimal solution is of complexity O(N²). Several approaches have been proposed to reduce the computational complexity, such as interior point (IP) methods, decomposition methods such as Sequential Minimal Optimization (SMO), and gradient-based methods for solving the primal SVM problem. Several approaches based on genetic search have recently been proposed for the more general problem of identifying the optimal kernel type from a pre-specified set of kernel types (linear, polynomial, RBF, Gaussian, Fourier, B-spline, spline, sigmoid). The fifth section of the paper is a brief survey of the most outstanding new techniques reported so far in this respect.
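As an illustration of the kernel-selection problem mentioned above, here is a minimal sketch that picks a kernel type from a pre-specified set by exhaustive grid search (a simpler stand-in for the genetic search discussed in the paper), assuming scikit-learn's SVC; the dataset and parameter values are purely illustrative.

```python
# Sketch: choosing a kernel type for an SVM classifier by grid search over a
# pre-specified set of kernels (illustrative stand-in for a genetic search).
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

# Synthetic two-class data (illustrative only).
X, y = make_classification(n_samples=200, n_features=5, random_state=0)

param_grid = {
    "kernel": ["linear", "poly", "rbf", "sigmoid"],  # subset of the kernel types listed above
    "C": [0.1, 1.0, 10.0],                           # soft-margin penalty
}
search = GridSearchCV(SVC(), param_grid, cv=5).fit(X, y)
print("selected kernel:", search.best_params_["kernel"])
print("selected C:", search.best_params_["C"])
```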

Highlights

  • An SVM can be viewed as a classifier discriminating between inputs coming from two classes, with the training set corresponding to a sequence of labeled inputs

  • The currently used methods for solving the resulting quadratic programming (QP) problem require access to all labeled samples at once, and the computation of an optimal solution is of complexity O(N²) (see the sketch after this list)

  • The support vector machines are a class of linear or kernel-based binary classifiers; the SMO algorithm [7] allows the SVM-QP dual problem to be solved without extra matrix storage
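To make the O(N²) cost noted above concrete, the sketch below sets up the full soft-margin SVM dual as a generic QP, assuming the cvxopt solver and a linear kernel; the explicit N x N Gram matrix is exactly the storage that decomposition methods such as SMO avoid. All names and data are illustrative.

```python
# Sketch: the soft-margin SVM dual solved as a dense QP (cvxopt assumed).
# Forming the N x N kernel matrix is the source of the O(N^2) memory cost.
import numpy as np
from cvxopt import matrix, solvers

def svm_dual_qp(X, y, C=1.0):
    N = X.shape[0]
    K = X @ X.T                                          # linear-kernel Gram matrix, N x N
    P = matrix(np.outer(y, y) * K + 1e-8 * np.eye(N))    # Q_ij = y_i y_j K(x_i, x_j), small ridge for stability
    q = matrix(-np.ones(N))                              # maximize sum(alpha) <=> minimize -1^T alpha
    G = matrix(np.vstack([-np.eye(N), np.eye(N)]))       # box constraints 0 <= alpha_i <= C
    h = matrix(np.hstack([np.zeros(N), C * np.ones(N)]))
    A = matrix(y.reshape(1, -1).astype(float))           # equality constraint sum_i alpha_i y_i = 0
    b = matrix(0.0)
    solvers.options["show_progress"] = False
    sol = solvers.qp(P, q, G, h, A, b)
    return np.array(sol["x"]).ravel()                    # Lagrange multipliers alpha_i

# Samples with alpha_i > 0 are the support vectors.
X = np.array([[0.0, 0.0], [1.0, 0.0], [3.0, 3.0], [4.0, 3.0]])
y = np.array([-1.0, -1.0, 1.0, 1.0])
print(svm_dual_qp(X, y, C=1.0))
```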


Summary

Introduction

An SVM can be viewed as a classifier discriminating between inputs coming from two classes, with the training set corresponding to a sequence of labeled inputs. Research in the SVM area has focused mainly on designing fast algorithms for solving the QP optimization problem and on refining the concepts aiming to extend SVMs to discriminate between non-separable classes. Several approaches have been proposed to reduce the computational complexity, such as interior point (IP) methods, decomposition methods such as Sequential Minimal Optimization (SMO), and gradient-based methods for solving the primal SVM problem. Several approaches based on genetic search have recently been proposed for the more general problem of identifying the optimal kernel type from a pre-specified set of kernel types (linear, polynomial, RBF, Gaussian, Fourier, B-spline, spline, sigmoid). The linear SVM implements a linearly parameterized classification decision rule, corresponding to a hyperplane almost equidistant from the subsamples labeled 1 and -1, respectively.
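As a small worked example of the linear decision rule sign(w·x + b) and of the (almost) equidistant hyperplane mentioned above, the following sketch assumes scikit-learn's SVC with a linear kernel; the four training points are hypothetical.

```python
# Sketch: the linear SVM decision rule sign(w . x + b) on a toy separable set.
import numpy as np
from sklearn.svm import SVC

X = np.array([[0.0, 0.0], [0.0, 1.0], [2.0, 2.0], [2.0, 3.0]])
y = np.array([-1, -1, 1, 1])

clf = SVC(kernel="linear", C=10.0).fit(X, y)
w, b = clf.coef_[0], clf.intercept_[0]

print("hyperplane parameters: w =", w, ", b =", b)
print("margin width 2/||w|| =", 2.0 / np.linalg.norm(w))  # distance between the two supporting hyperplanes
print("support vectors:", clf.support_vectors_)
print("predictions sign(w.x+b):", clf.predict(X))
```

The support vectors are the labeled samples closest to the separating hyperplane; only they determine w and b.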

