Abstract

Recurrent neural networks (RNNs) based on long short-term memory (LSTM) cells are used to predict future values of a given set of time-series data. Usually, only one future time step is predicted. In this article, the capability of LSTM networks to look far into the future is explored. The time-series data are taken from the evolution of share prices in stock trading. As expected, the further the prediction reaches into the future, the stronger the deviations between prediction and reality become. However, strange memory effects are observed. They range from periodic predictions (with time periods of the order of one month) to predictions that are an exact copy of a long-term sequence from far earlier data. The trigger mechanisms for recalling memory in LSTM networks seem to be rather independent of the behaviour of the time-series data within the last "sliding window" or "batch". Similar periodic predictions are also observed for GRU networks and when the number of trainable parameters is reduced drastically. A better understanding of the influence of regularisation details of RNNs may be helpful for improving their predictive power.
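The paper itself provides no code; as a rough illustration of the scheme the abstract describes — training a network to predict one step ahead and then feeding each prediction back into the sliding window to look many steps into the future — the following PyTorch sketch may help. All names, window lengths, layer sizes and the horizon are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of recursive multi-step forecasting with an LSTM.
# Assumes a univariate (share-price) series and a fixed sliding-window length.
# WINDOW, HORIZON and the layer sizes are illustrative, not from the paper.
import torch
import torch.nn as nn

WINDOW = 60   # length of the sliding window fed to the network (assumed)
HORIZON = 30  # number of future steps predicted recursively (assumed)

class LSTMForecaster(nn.Module):
    def __init__(self, hidden_size=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=1, hidden_size=hidden_size,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, 1)

    def forward(self, x):                  # x: (batch, WINDOW, 1)
        out, _ = self.lstm(x)              # out: (batch, WINDOW, hidden)
        return self.head(out[:, -1, :])    # one-step-ahead prediction

def predict_many_steps(model, window, horizon=HORIZON):
    """Feed each one-step prediction back into the window to look far ahead."""
    window = list(window)                  # last observed values, len >= WINDOW
    preds = []
    model.eval()
    with torch.no_grad():
        for _ in range(horizon):
            x = torch.tensor(window[-WINDOW:],
                             dtype=torch.float32).view(1, WINDOW, 1)
            y = model(x).item()
            preds.append(y)
            window.append(y)               # prediction becomes the next input
    return preds

# Usage (illustrative): train an LSTMForecaster on one-step-ahead targets,
# then call predict_many_steps(model, last_60_prices) for a wide look ahead.
```

Under this recursive scheme, prediction errors compound with each fed-back step, which is consistent with the abstract's observation that deviations grow with the forecast horizon.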
