Abstract

An analysis of nonlinear time series prediction schemes realised through advanced Recurrent Neural Network (RNN) techniques is provided. Due to practical constraints on common RNNs, such as the vanishing gradient problem, alternative ways to improve RNN-based prediction are analysed. This is undertaken for architectures ranging from a simple RNN through to the Pipelined Recurrent Neural Network (PRNN), which consists of a number of nested small-scale RNNs. A Nonlinear AutoRegressive Moving Average (NARMA) model is introduced in the context of RNN architectures, together with an a posteriori mode of operation within that framework. Moreover, it is shown that the basic a priori PRNN structure exhibits certain a posteriori features. The PRNN-based predictor is shown to exhibit nesting and to be able to represent block-cascaded stochastic models, such as the Wiener–Hammerstein model. Simulations undertaken on a speech signal support the analysis.
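To make the nesting idea concrete, the following is a minimal sketch of one-step-ahead prediction with a pipelined structure of nested modules. It is an illustration under simplifying assumptions, not the paper's implementation: each module here is collapsed to a single tanh neuron with shared weights `W` (in the actual PRNN each module is a small fully recurrent network, trained with a weighted sum of the module errors); the tap length `p` and module count `M` are hypothetical defaults. Module M processes the most delayed signal taps first, and each module feeds its output forward to the next less-delayed module, so the modules are nested rather than independent.

```python
import numpy as np

def prnn_predict(x, W, M=4, p=4):
    """One-step-ahead prediction with a simplified pipelined RNN.

    x : 1-D signal history, newest sample last (needs >= M + p samples).
    W : shared weight vector of length p + 2 (p signal taps,
        one feedback input from the more-delayed module, one bias).
    Module M runs first on the oldest taps; its output cascades
    through modules M-1, ..., 1, and module 1's output is returned
    as the prediction of the next sample.
    """
    x = np.asarray(x, dtype=float)
    y_next = 0.0                                  # no module beyond M
    for i in range(M, 0, -1):
        # taps x(n-i), ..., x(n-i-p+1), newest first
        taps = x[len(x) - i - p : len(x) - i][::-1]
        z = np.concatenate([taps, [y_next, 1.0]])
        y_next = np.tanh(W @ z)                   # module i's output
    return y_next
```

Because each module reuses the same weights on progressively less-delayed data, only one small weight set must be adapted, which is the practical appeal of the pipelined structure over one large RNN.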

