Abstract

To reduce the computational complexity and improve the performance of the recurrent wavelet neural network (RWNN), this paper presents a novel modular recurrent neural network based on a pipelined architecture (PRWNN) with low computational complexity. Its modified adaptive real-time recurrent learning (RTRL) algorithm is derived from the gradient descent approach. The PRWNN comprises a number of RWNN modules cascaded in a chained form and inherits the modular architecture of the pipelined recurrent neural network (PRNN) proposed by Haykin and Li. Because the modules of the PRWNN can run simultaneously in a pipelined, parallel fashion, computational efficiency is improved significantly, and the performance of the PRWNN can be further improved as well. Computer simulations demonstrate that the PRWNN provides considerably better performance than a single RWNN model for nonlinear dynamic system identification.
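To make the architectural idea concrete, the following is a minimal sketch of a pipelined chain of recurrent wavelet modules, in the spirit of the PRWNN described above. The module count, number of input taps, Morlet-type wavelet activation, weight sharing across modules, and the simple forward pass are illustrative assumptions, not the paper's exact formulation; the modified adaptive RTRL weight update is omitted.

    # Hedged sketch (Python/NumPy): a chain of weight-sharing recurrent
    # wavelet modules, where each module feeds its output back to the
    # preceding module in the chain, as in Haykin and Li's PRNN layout.
    import numpy as np

    def morlet(x):
        # Morlet-type wavelet used here as the hidden-unit activation (assumed).
        return np.cos(5.0 * x) * np.exp(-0.5 * x * x)

    class PRWNNSketch:
        def __init__(self, n_modules=4, n_taps=4, n_hidden=6, seed=0):
            rng = np.random.default_rng(seed)
            self.M, self.p, self.h = n_modules, n_taps, n_hidden
            # One shared weight set for all modules; each module sees
            # p delayed input taps plus one feedback value from the chain.
            self.W_in = 0.1 * rng.standard_normal((n_hidden, n_taps + 1))
            self.w_out = 0.1 * rng.standard_normal(n_hidden)
            self.prev_out = np.zeros(n_modules)

        def forward(self, x_window):
            # x_window: the most recent (M + p - 1) input samples, newest last.
            outs = np.zeros(self.M)
            feedback = self.prev_out[-1]  # last module reuses its own delayed output
            for m in reversed(range(self.M)):
                start = self.M - 1 - m            # module m sees inputs delayed by m
                taps = x_window[start:start + self.p]
                hidden = morlet(self.W_in @ np.concatenate([taps, [feedback]]))
                outs[m] = self.w_out @ hidden
                feedback = outs[m]                # pass the output down the chain
            self.prev_out = outs
            return outs[0]                        # module 1 gives the overall prediction

    # Usage sketch: one-step-ahead prediction over a toy signal.
    x = np.sin(0.1 * np.arange(200))
    net = PRWNNSketch()
    window = net.M + net.p - 1
    preds = [net.forward(x[t - window:t]) for t in range(window, len(x))]

In an actual implementation, the modules would be evaluated concurrently in a pipelined fashion and the shared weights adapted online with the modified RTRL gradient rule, which is what yields the complexity and performance gains claimed in the abstract.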
