Abstract

Support vector machines (SVMs), with their roots in Statistical Learning Theory (SLT) and optimization methods, have become powerful tools for solving machine learning problems. SVMs reduce most machine learning problems to optimization problems, and optimization lies at the heart of SVMs. Many SVM algorithms involve solving not only convex problems, such as linear programming, quadratic programming, second-order cone programming, and semi-definite programming, but also non-convex and more general optimization problems, such as integer programming, semi-infinite programming, and bi-level programming. The purpose of this paper is to understand SVMs from the optimization point of view and to review several representative optimization models in SVMs together with their applications in economics, in order to promote research interest in both optimization-based SVM theory and economics applications. This paper starts by summarizing and explaining the nature of SVMs. It then discusses optimization models for SVMs following three major themes. First, least squares SVM, twin SVM, AUC-maximizing SVM, and fuzzy SVM are discussed for standard problems. Second, support vector ordinal regression, semi-supervised SVM, Universum SVM, robust SVM, knowledge-based SVM, and multi-instance SVM are presented for nonstandard problems. Third, we explore other important issues such as lp-norm SVM for feature selection, LOO-SVM based on minimizing the leave-one-out (LOO) error bound, probabilistic outputs for SVMs, and rule extraction from SVMs. Finally, several applications of SVMs to financial forecasting, bankruptcy prediction, and credit risk analysis are introduced.
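As a concrete illustration of the quadratic programming mentioned above, the standard soft-margin C-SVM primal (a textbook formulation, not quoted from this paper) is the convex QP:

```latex
\min_{w,\,b,\,\xi}\ \frac{1}{2}\|w\|^2 + C\sum_{i=1}^{n}\xi_i
\qquad \text{s.t.}\quad y_i\,(w^{\top}x_i + b) \ge 1 - \xi_i,\quad \xi_i \ge 0,\quad i = 1,\dots,n,
```

where the parameter C > 0 trades off margin width against training errors; the Lagrangian dual of this problem is likewise a convex QP, which is the form most SVM solvers actually handle.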

Highlights

  • Support vector machines (SVMs), which were introduced by Vapnik and his coworkers in the early 1990s (Cortes, Vapnik 1995; Vapnik 1996, 1998), have proved to be effective and promising techniques for data mining (Peng et al 2008; Yang, Wu 2006)

  • Unlike traditional methods (e.g. neural networks), SVMs have their roots in Statistical Learning Theory (SLT) and optimization methods; they are powerful tools for solving machine learning problems with finite training points and overcome traditional difficulties such as the “curse of dimensionality” and over-fitting

  • SVMs reduce most machine learning problems to optimization problems; optimization lies at the heart of SVMs, and convex optimization in particular plays a central role
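To make the "optimization lies at the heart of SVMs" point concrete, below is a minimal sketch (not from the paper) that trains a linear soft-margin SVM by plain subgradient descent on the regularized hinge-loss form of the primal; the function name and the choices of C, learning rate, and epoch count are illustrative assumptions.

```python
# Minimal sketch: minimize the C-SVM primal in its unconstrained form
#   min_{w,b}  0.5*||w||^2 + C * sum_i max(0, 1 - y_i*(w.x_i + b))
# by subgradient descent on a tiny toy dataset.
import numpy as np

def train_linear_svm(X, y, C=1.0, lr=0.01, epochs=200):
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        viol = margins < 1  # points violating the margin contribute to the hinge loss
        # subgradient of the regularized hinge loss
        grad_w = w - C * (y[viol, None] * X[viol]).sum(axis=0)
        grad_b = -C * y[viol].sum()
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Toy linearly separable data: two points per class
X = np.array([[2.0, 2.0], [1.5, 2.5], [-2.0, -1.0], [-1.0, -2.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])
w, b = train_linear_svm(X, y)
preds = np.sign(X @ w + b)
```

On separable data like this, the learned hyperplane classifies all training points correctly; production SVM solvers instead solve the dual QP (e.g. by SMO), but the objective being optimized is the same.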


Summary

Introduction

Support vector machines (SVMs), which were introduced by Vapnik and his coworkers in the early 1990s (Cortes, Vapnik 1995; Vapnik 1996, 1998), have proved to be effective and promising techniques for data mining (Peng et al 2008; Yang, Wu 2006). SVMs' theoretical foundation and implementation techniques have been established, and SVMs have developed rapidly and gained popularity due to a number of attractive features: nice mathematical representations, geometrical explanations, good generalization abilities, and promising empirical performance (Cristianini, Shawe-Taylor 2000; Deng, Tian 2004, 2009; Deng et al 2012; Herbrich 2002; Schölkopf, Smola 2002).

The nature of C-Support vector machines
Optimization models of support vector machines
Models for standard problems
Least squares support vector machine
Twin support vector machine
AUC maximizing support vector machine
Fuzzy support vector machine
Models for nonstandard problems
Support vector ordinal regression
Semi-supervised support vector machine
Universum support vector machine
Robust support vector machine
Knowledge based support vector machine
Multi-instance support vector machine
Feature selection via SVMs
LOO error bounds for SVMs
Probabilistic outputs for support vector machines
Rule extraction from support vector machines
Applications in economics
Remarks and future directions