Abstract

The echo state network (ESN), a special type of recurrent neural network with a large, randomly generated and fixed hidden layer (the reservoir) and a trainable linear output layer, has been widely employed in time series analysis and modeling. However, in multidimensional chaotic time series prediction, the randomly generated input and reservoir weights not only enrich the representation of informative variables but also inevitably accumulate redundant and irrelevant information. To remove these redundant components, reduce the approximate collinearity among echo states, and improve generalization and stability, a new method called the hierarchical ESN with sparse learning (HESN-SL) is proposed. The HESN-SL captures the latent evolution patterns of the dynamical system through layer-by-layer processing in stacked reservoirs, and leverages a monotone accelerated proximal gradient algorithm to train a sparse output layer with variable selection capability. We further prove that the HESN-SL satisfies the echo state property, which guarantees the stability and convergence of the proposed model when applied to time series prediction. Experimental results on two synthetic chaotic systems and a real-world meteorological dataset show that the proposed HESN-SL outperforms both the original ESN and existing hierarchical ESN-based models for multidimensional chaotic time series prediction.
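The architecture sketched in the abstract (stacked random reservoirs followed by a sparse, L1-regularized readout) can be illustrated with a minimal example. This is not the authors' HESN-SL implementation: the layer sizes, leak rate, spectral radius, and the plain ISTA-style proximal gradient loop (used here in place of the paper's monotone accelerated proximal gradient algorithm) are illustrative assumptions, and the toy input series is hypothetical.

```python
import numpy as np

def init_reservoir(n_in, n_res, spectral_radius=0.9, density=0.1, seed=0):
    """Randomly initialize input and reservoir weights; rescale the reservoir
    so its spectral radius is below 1, a common sufficient condition for the
    echo state property."""
    rng = np.random.default_rng(seed)
    W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
    W = rng.uniform(-0.5, 0.5, (n_res, n_res))
    W *= rng.random((n_res, n_res)) < density              # sparsify connections
    W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))
    return W_in, W

def run_reservoir(u, W_in, W, leak=1.0):
    """Drive the reservoir with the input sequence u (T x n_in) and collect
    the echo states (T x n_res)."""
    T, n_res = u.shape[0], W.shape[0]
    x = np.zeros(n_res)
    states = np.empty((T, n_res))
    for t in range(T):
        x = (1 - leak) * x + leak * np.tanh(W_in @ u[t] + W @ x)
        states[t] = x
    return states

def soft_threshold(z, tau):
    return np.sign(z) * np.maximum(np.abs(z) - tau, 0.0)

def sparse_readout(H, y, lam=1e-3, n_iter=500):
    """L1-regularized least-squares readout fitted with a proximal gradient
    (ISTA-style) loop; the L1 proximal step is soft-thresholding, which zeroes
    out weights attached to redundant echo states."""
    w = np.zeros(H.shape[1])
    step = 1.0 / np.linalg.norm(H, 2) ** 2                  # 1 / Lipschitz constant
    for _ in range(n_iter):
        grad = H.T @ (H @ w - y)
        w = soft_threshold(w - step * grad, step * lam)
    return w

# Toy usage: two stacked reservoirs predicting the next value of a noisy sine series.
rng = np.random.default_rng(1)
series = np.sin(0.3 * np.arange(600))[:, None] + 0.01 * rng.standard_normal((600, 1))
u, y = series[:-1], series[1:, 0]

W_in1, W1 = init_reservoir(1, 100, seed=1)
H1 = run_reservoir(u, W_in1, W1)                            # first-layer echo states
W_in2, W2 = init_reservoir(100, 100, seed=2)
H2 = run_reservoir(H1, W_in2, W2)                           # second layer re-encodes layer-1 states

H = np.hstack([H1, H2])[50:]                                # drop washout, concatenate layers
w = sparse_readout(H, y[50:])
print("nonzero readout weights:", np.count_nonzero(w), "/", w.size)
```

The sparsity of the fitted readout is what performs the variable selection described in the abstract: echo-state dimensions that carry redundant or approximately collinear information receive zero weight and are excluded from the prediction.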
