Abstract

Recently, the development of deep learning has promoted the wide application of machine learning. In particular, reservoir computing has attracted increasing attention for its excellent performance in chaotic time series prediction and has become a new research hotspot. In this paper, we use traditional long short-term memory (LSTM) networks and fully connected layers as the fundamental elements to build an LSTM learning machine based on the recurrent neural network architecture. The motivation for using LSTM is that it effectively prevents vanishing and exploding gradients. In our simulation experiments, we quantify the duration of accurate prediction with the average valid time and use the model to predict the state of the Lorenz system. To address the special dynamical properties of chaotic systems, we propose four strategies to assist prediction: normalization and restoration, scaling down the gradients, reservoir-based initialization, and preservation of the optimal model. The results show that the prediction ability of the LSTM learning machine with suitable strategies is comparable to that of reservoir computing, while the complexity of the LSTM is lower. Therefore, our results indicate no clear evidence that reservoir computing surpasses the traditional method, which motivates further study of the mechanisms and methods of learning machines for time series prediction and the search for more effective learning machines.
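The abstract names four assistive strategies but gives no implementation details. As a minimal illustrative sketch (not the authors' code), the first two strategies, normalization and restoration of the Lorenz trajectory and scaling down the gradients when their norm grows too large, could look like the following; all function names, the Euler integrator, and the parameter defaults here are assumptions for illustration:

```python
import numpy as np

def lorenz_trajectory(n_steps, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Integrate the Lorenz system with simple Euler steps (illustrative only)."""
    xyz = np.empty((n_steps, 3))
    x, y, z = 1.0, 1.0, 1.0
    for i in range(n_steps):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        xyz[i] = (x, y, z)
    return xyz

def normalize(data):
    """Strategy 1, forward half: map each coordinate to zero mean, unit variance."""
    mean, std = data.mean(axis=0), data.std(axis=0)
    return (data - mean) / std, mean, std

def restore(scaled, mean, std):
    """Strategy 1, backward half: undo the normalization on model predictions."""
    return scaled * std + mean

def scale_down_gradients(grads, max_norm=1.0):
    """Strategy 2: rescale gradients so their global norm does not exceed max_norm."""
    norm = np.sqrt(sum(float(np.sum(g ** 2)) for g in grads))
    scale = min(1.0, max_norm / (norm + 1e-12))
    return [g * scale for g in grads]
```

In practice the normalized trajectory would feed the LSTM, and `restore` would map its outputs back to the original coordinates before computing the valid time of the prediction.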
