Abstract
The purpose of this paper is to provide an introductory tutorial on the basic ideas behind support vector machines (SVM). The paper starts with an overview of the structural risk minimization (SRM) principle and describes how an SVM is constructed. For a two-class pattern recognition problem, we discuss in detail the classification mechanism of SVM in three cases: linearly separable, linearly nonseparable, and nonlinear. Finally, for the nonlinear case, we give a new function mapping technique: by choosing an appropriate kernel function, the SVM can map the low-dimensional input space into a high-dimensional feature space and construct an optimal separating hyperplane with maximum margin in that feature space.
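As a minimal NumPy sketch (not part of the paper itself) of the kernel mapping described above: a kernel function computes inner products in a high-dimensional feature space without ever forming that space explicitly. Here `phi` is the explicit feature map for a degree-2 polynomial kernel on 2-D inputs, chosen purely for illustration.

```python
# Sketch of the kernel trick: k(x, z) equals the inner product of the
# mapped vectors phi(x), phi(z) in a higher-dimensional feature space.
import numpy as np

def phi(x):
    """Explicit feature map for the homogeneous degree-2 polynomial
    kernel on 2-D inputs: maps R^2 into R^3."""
    x1, x2 = x
    return np.array([x1 * x1, np.sqrt(2.0) * x1 * x2, x2 * x2])

def poly_kernel(x, z):
    """Degree-2 polynomial kernel k(x, z) = (x . z)^2, computed in R^2."""
    return np.dot(x, z) ** 2

x = np.array([1.0, 2.0])
z = np.array([3.0, 0.5])

# Both quantities agree: the kernel evaluates the feature-space inner
# product while working only with the low-dimensional inputs.
print(poly_kernel(x, z))       # (1*3 + 2*0.5)^2 = 16.0
print(np.dot(phi(x), phi(z)))  # also 16.0
```

This identity is why an SVM can find a maximum-margin hyperplane in the feature space while all computations stay in the input space.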