Abstract

Twin support vector regression (TSVR) generates two nonparallel hyperplanes by solving a pair of smaller-sized problems instead of the single larger-sized problem in standard SVR. Due to this efficiency, TSVR is frequently applied in various areas. In this paper, we propose a new variant of TSVR, named Linear Twin Quadratic Surface Support Vector Regression (LTQSSVR), which directly uses two quadratic surfaces in the original space for regression. Notably, the new approach not only avoids the notoriously difficult and time-consuming search for a suitable kernel function and its parameters required by traditional SVR-based methods, but also achieves better generalization performance. Moreover, to further improve the efficiency and robustness of the model, we introduce the 1-norm to measure the error. The resulting linear programming structure avoids the matrix inverse operation and makes the model solvable for huge-sized problems, a capability that is essential in the big data era. To verify the effectiveness and efficiency of our model, we compare it with several well-known methods. Numerical experiments on 2 artificial data sets and 12 benchmark data sets demonstrate the validity and applicability of the proposed method.
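To make the kernel-free, LP-based idea concrete, the following sketch (ours, not the authors' code; the names quad_features and fit_l1_quadratic_surface are hypothetical) fits a single quadratic surface f(x) = 0.5*x'Ax + b'x + c under a 1-norm loss with 1-norm regularization. Because f is linear in its coefficients, the fit reduces to one linear program with no kernel selection and no matrix inversion; the full LTQSSVR instead solves a pair of such problems to obtain two surfaces.

    import numpy as np
    from scipy.optimize import linprog

    def quad_features(X):
        # Map each sample x to its quadratic monomials [x_i*x_j (i <= j), x, 1],
        # so that f(x) = 0.5*x'Ax + b'x + c is linear in the parameter vector w.
        n, d = X.shape
        rows, cols = np.triu_indices(d)
        quad = X[:, rows] * X[:, cols]
        return np.hstack([quad, X, np.ones((n, 1))])

    def fit_l1_quadratic_surface(X, y, C=1.0):
        # Solve the linear program:
        #     min  sum(e) + C * sum(t)
        #     s.t. -e <= Z w - y <= e   (1-norm fitting error)
        #          -t <= w <= t         (1-norm regularization)
        # with variable layout v = [w (p), e (n), t (p)].
        Z = quad_features(X)
        n, p = Z.shape
        c = np.concatenate([np.zeros(p), np.ones(n), C * np.ones(p)])
        A_ub = np.block([
            [ Z,         -np.eye(n),       np.zeros((n, p))],  #  Z w - y <= e
            [-Z,         -np.eye(n),       np.zeros((n, p))],  # -(Z w - y) <= e
            [ np.eye(p), np.zeros((p, n)), -np.eye(p)],        #  w <= t
            [-np.eye(p), np.zeros((p, n)), -np.eye(p)],        # -w <= t
        ])
        b_ub = np.concatenate([y, -y, np.zeros(2 * p)])
        bounds = [(None, None)] * p + [(0, None)] * (n + p)
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
        return res.x[:p]  # predict on new data with quad_features(X_new) @ w

Note that no inverse of a Gram or kernel matrix appears anywhere: the whole fit is one call to an LP solver, which is the source of the scalability claim above.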

Highlights

  • Support Vector Machine (SVM) was first introduced by Vapnik in [1, 2]

  • At present, SVM-based regression models are mainly composed of Support Vector Regression (SVR) [8], Least Square Support Vector Regression (LSSVR) [9], and Twin Support Vector Regression (TSVR) [10]. These models have been widely utilized in various areas, such as stock market forecasting [11, 12], image understanding [13, 14], and pattern recognition [15]

  • Luo et al. [17] proposed a kernel-free fuzzy quadratic surface support vector machine (FQSSVM) model which directly generates a quadratic surface for classification


Summary

Introduction

Support Vector Machine (SVM) was first introduced by Vapnik in [1, 2]. It is a classical classification method based on the principle of structural risk minimization. TSVR has better generalization ability than the traditional SVR. Another advantage of TSVR is that the final regression function is obtained by solving two small-sized quadratic programming problems, which leads to a lower computational burden. Luo et al. [17] proposed a kernel-free fuzzy quadratic surface support vector machine (FQSSVM) model, which directly generates a quadratic surface for classification. In this way, it skips the notorious search for a proper kernel function in the classical kernel-based SVM, which further relieves the computational burden and improves efficiency. Compared with benchmark SVM-based regression models, our new model achieves slightly better accuracy and is far more efficient to build and solve.
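As a rough structural illustration of the twin idea (our simplification, reusing quad_features and fit_l1_quadratic_surface from the sketch above): two independent small problems produce a down-bound surface f1 and an up-bound surface f2, and the final regressor is their average, f(x) = (f1(x) + f2(x))/2, as in standard TSVR. The real twin objectives penalize violations of each bound asymmetrically, so this sketch mirrors only the structure, not the exact optimization.

    import numpy as np

    def fit_twin_surfaces(X, y, eps=0.1, C=1.0):
        # Two smaller problems instead of one large one: each LP fits one
        # epsilon-shifted bound. (A simplification: the true TSVR objectives
        # couple each bound to the data through asymmetric penalties.)
        w_down = fit_l1_quadratic_surface(X, y - eps, C)  # lower-bound surface f1
        w_up   = fit_l1_quadratic_surface(X, y + eps, C)  # upper-bound surface f2
        return w_down, w_up

    def twin_predict(X_new, w_down, w_up):
        # Standard TSVR-style estimator: mean of the two nonparallel surfaces.
        Z = quad_features(X_new)
        return 0.5 * (Z @ w_down + Z @ w_up)

    # Toy usage: a genuinely quadratic target, fitted with no kernel search.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(80, 2))
    y = 0.5 * (X[:, 0] ** 2 - X[:, 1] ** 2) + X[:, 0] + rng.normal(0, 0.1, 80)
    w1, w2 = fit_twin_surfaces(X, y)
    y_hat = twin_predict(X, w1, w2)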

Related Works
Twin Quadratic Surface Support Vector Regression
Linear Twin Quadratic Surface Support Vector Regression
Numerical Experiment and Discussion
Findings
Conclusion