Abstract

Reservoir Computing (RC) is a framework based on recurrent neural networks that enables high-speed learning and has attracted much attention. RC has been applied to a variety of temporal recognition tasks. In particular, Jaeger showed that the Echo State Network (ESN), one of the RC models, is effective for chaotic time series prediction tasks. However, nonlinear time series prediction with the standard ESN faces two inevitable problems. One is that its prediction ability saturates as the reservoir size is increased. The other is that its prediction ability depends heavily on hyperparameter values. In this paper, we propose a multi-step learning ESN to solve these problems. The proposed system has multiple reservoirs, and the prediction error of one ESN-based predictor is corrected by a subsequent predictor. We demonstrate the effectiveness of the proposed method on two nonlinear time series prediction tasks. A further experiment using Lyapunov exponents suggests that the performance of the proposed method is robust against changes in hyperparameter values. In addition, we clarify the characteristics of the proposed method with regard to nonlinearity and memory using simple function approximation tasks.
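The multi-step idea described above can be sketched in a few lines of NumPy: a first ESN is trained on the target series, and a second reservoir is then trained on the residual error of the first, so the summed output corrects the stage-one prediction. This is a minimal illustration under assumed settings (reservoir size, spectral radius, ridge coefficient, and the toy series are all our choices, not the paper's), not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

class ESN:
    """Minimal leaky echo state network with ridge-regression readout."""

    def __init__(self, n_in, n_res, spectral_radius=0.9, leak=1.0, ridge=1e-6):
        self.W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
        W = rng.uniform(-0.5, 0.5, (n_res, n_res))
        # rescale recurrent weights to the desired spectral radius
        W *= spectral_radius / max(abs(np.linalg.eigvals(W)))
        self.W, self.leak, self.ridge, self.n_res = W, leak, ridge, n_res

    def _states(self, U):
        x = np.zeros(self.n_res)
        X = np.empty((len(U), self.n_res))
        for t, u in enumerate(U):
            pre = np.tanh(self.W_in @ np.atleast_1d(u) + self.W @ x)
            x = (1 - self.leak) * x + self.leak * pre  # leaky-integrator update
            X[t] = x
        return X

    def fit(self, U, Y, washout=50):
        X = self._states(U)[washout:]          # discard transient states
        Yw = np.asarray(Y)[washout:]
        self.W_out = np.linalg.solve(
            X.T @ X + self.ridge * np.eye(self.n_res), X.T @ Yw)
        return self

    def predict(self, U):
        return self._states(U) @ self.W_out

# Toy task (our choice): one-step-ahead prediction of a nonlinear series.
t = np.arange(1001)
u = np.sin(0.2 * t) + 0.5 * np.sin(0.311 * t) ** 3
U, Y = u[:-1], u[1:]

esn1 = ESN(1, 100).fit(U, Y)
y1 = esn1.predict(U)

# Stage 2: a second reservoir learns the residual error of the first predictor.
esn2 = ESN(1, 100).fit(U, Y - y1)
y2 = y1 + esn2.predict(U)

def nmse(y, target, w=50):
    return np.mean((y[w:] - target[w:]) ** 2) / np.var(target[w:])

print(f"stage 1 NMSE: {nmse(y1, Y):.3e}, stage 2 NMSE: {nmse(y2, Y):.3e}")
```

Because the second readout is fit to the stage-one residual, the combined training error cannot exceed that of the first predictor alone; the paper's contribution is showing that this correction also helps generalization and robustness to hyperparameters.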
