Abstract

Support Vector Regression (SVR) is an attractive approach to data modeling. SVR maps nonlinear input into a feature space in which the regression problem becomes linear. Instead of minimizing the observed training error, SVR minimizes a bound on the generalization error through structural risk minimization, combined with the kernel trick for capacity control. Model selection plays an important role in the performance of SVR; in SVR problems, we therefore attempt to improve generalization by maximizing the margin. Experimental results show that intelligent model selection is crucial to avoid overfitting and overestimating the generalization capability on such multidimensional datasets. Techniques for choosing the kernel function and for additional capacity control in SVR remain an active area of research. In this paper, we develop Genetic Folding (GF) for kernel selection in SVR. This methodology was motivated and validated by our previously published work [4] on classification models. Finally, we present comparative results against predefined kernel models.
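As an illustration of the kernel-selection problem the abstract describes (not the paper's GF method itself), the following sketch compares predefined SVR kernels by cross-validated score, using scikit-learn and a synthetic nonlinear dataset chosen for this example:

```python
# Illustrative sketch only: compare predefined SVR kernels by cross-validated
# R^2 score. Kernel-selection schemes such as Genetic Folding aim to do better
# than picking from a fixed list like this. Dataset and parameters are
# assumptions made for the example, not taken from the paper.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)  # nonlinear target

scores = {}
for kernel in ("linear", "poly", "rbf"):
    model = SVR(kernel=kernel, C=1.0)
    # Mean R^2 over 5 folds approximates generalization performance.
    scores[kernel] = cross_val_score(model, X, y, cv=5).mean()

best = max(scores, key=scores.get)
print(best, {k: round(v, 3) for k, v in scores.items()})
```

On this nonlinear target the RBF kernel typically scores highest, which is exactly the kind of data-dependent choice that motivates evolving kernels automatically rather than fixing one in advance.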
