Abstract

The least squares support vector machine (LS-SVM) is a popular tool for the analysis of time series data sets, and choosing optimal hyperparameter values for the LS-SVM is an important step in time series analysis. In this paper, we combine the LS-SVM with a simulated annealing (SA) algorithm for nonlinear time series analysis. The LS-SVM is used to predict chaotic time series; its hyperparameters are tuned automatically by SA, with generalization performance estimated by minimizing the k-fold cross-validation error. The benchmark Mackey-Glass time series is used as a demonstration problem. The results show that this approach avoids the arbitrariness of manually chosen LS-SVM parameters and improves the prediction of chaotic time series.

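The abstract only outlines the method, so the following is a minimal, hypothetical sketch (not the authors' code) of how SA-driven tuning of an RBF-kernel LS-SVM against a k-fold cross-validation error might look on a Mackey-Glass series. The generator settings, the lag embedding (4 lags, step 6), the annealing schedule, the search ranges, and every function name below are illustrative assumptions.

```python
import numpy as np

def mackey_glass(n=1500, tau=17, a=0.2, b=0.1, dt=1.0, x0=1.2):
    """Generate a Mackey-Glass series by simple Euler integration (illustrative)."""
    x = np.full(n + tau, x0)
    for t in range(tau, n + tau - 1):
        x[t + 1] = x[t] + dt * (a * x[t - tau] / (1 + x[t - tau] ** 10) - b * x[t])
    return x[tau:]

def embed(series, order=4, step=6):
    """Build lagged inputs and one-step-ahead (step samples) targets."""
    X, y = [], []
    for i in range((order - 1) * step, len(series) - step):
        X.append([series[i - j * step] for j in range(order)])
        y.append(series[i + step])
    return np.array(X), np.array(y)

def rbf(A, B, sigma):
    """RBF (Gaussian) kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def lssvm_fit(X, y, gamma, sigma):
    """Solve the LS-SVM linear system [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = rbf(X, X, sigma) + np.eye(n) / gamma
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                      # bias b, coefficients alpha

def lssvm_predict(Xtr, b, alpha, Xte, sigma):
    return rbf(Xte, Xtr, sigma) @ alpha + b

def cv_error(X, y, gamma, sigma, k=5):
    """k-fold cross-validation mean squared error for a given (gamma, sigma)."""
    folds = np.array_split(np.arange(len(y)), k)
    err = 0.0
    for f in folds:
        mask = np.ones(len(y), bool)
        mask[f] = False
        b, alpha = lssvm_fit(X[mask], y[mask], gamma, sigma)
        err += np.mean((lssvm_predict(X[mask], b, alpha, X[f], sigma) - y[f]) ** 2)
    return err / k

def anneal(X, y, T0=1.0, cooling=0.9, iters=40, seed=0):
    """Simulated annealing over (log10 gamma, log10 sigma) minimizing the CV error."""
    rng = np.random.default_rng(seed)
    state = np.array([1.0, 0.0])                # assumed start: gamma=10, sigma=1
    cost = cv_error(X, y, 10 ** state[0], 10 ** state[1])
    best, best_cost, T = state.copy(), cost, T0
    for _ in range(iters):
        cand = state + rng.normal(scale=0.3, size=2)    # random neighbour in log space
        c = cv_error(X, y, 10 ** cand[0], 10 ** cand[1])
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if c < cost or rng.random() < np.exp((cost - c) / T):
            state, cost = cand, c
            if cost < best_cost:
                best, best_cost = state.copy(), cost
        T *= cooling                            # geometric cooling schedule
    return 10 ** best[0], 10 ** best[1], best_cost

series = mackey_glass()
X, y = embed(series[:500])
gamma, sigma, err = anneal(X, y)
print(f"gamma={gamma:.3g}, sigma={sigma:.3g}, CV MSE={err:.3e}")
```

The search is done in log space so the SA perturbations act multiplicatively on gamma and sigma, which is a common (assumed, not paper-specified) choice for hyperparameters spanning several orders of magnitude.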