Abstract
In this paper a critical review of gradient-based training methods for recurrent neural networks is presented, including Back Propagation Through Time (BPTT), Real Time Recurrent Learning (RTRL) and several specific learning algorithms for different locally recurrent architectures. This survey reveals the need for a unifying view of all the specific procedures proposed for networks with local feedbacks, one that takes into account the general framework of recurrent network learning: BPTT and RTRL. Therefore a learning method for local feedback networks is proposed which combines the best feature of BPTT, i.e. its low complexity, with the best feature of RTRL, i.e. its on-line operation, and which includes as special cases several previously proposed algorithms, such as Temporal Back Propagation, Back Propagation for Sequences, the Back-Tsoi algorithm and some others. In its general version, this new training method allows efficient and accurate on-line gradient calculation. It compares favourably with the previous algorithms in stability, speed/complexity trade-off and accuracy.

Keywords: Back Propagation, Finite Impulse Response, Recurrent Neural Network, Multi Layer Perceptron, Neural Computation
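The complexity/on-line trade-off the abstract refers to can be made concrete on a minimal fully recurrent network. The sketch below is an illustration, not the paper's proposed method; all names (W_xh, W_hh, etc.) and the toy data are assumptions of this demo. It computes the gradient of a squared-error loss with respect to the recurrent weights in two ways: BPTT, which stores the forward trajectory and makes one cheap backward sweep once the whole sequence has been seen, and RTRL, which propagates a sensitivity tensor forwards and therefore has the exact gradient available on-line at every step, at a much higher per-step cost.

    # Illustrative sketch only (not the paper's algorithm): gradient of a
    # squared-error loss w.r.t. the recurrent weights of a tiny vanilla RNN,
    # computed both by BPTT and by RTRL. All names and data are demo assumptions.
    import numpy as np

    rng = np.random.default_rng(0)
    n, m, T = 4, 3, 6                        # hidden units, inputs, time steps
    W_xh = 0.3 * rng.standard_normal((n, m))
    W_hh = 0.3 * rng.standard_normal((n, n))
    xs = rng.standard_normal((T, m))
    targets = rng.standard_normal((T, n))

    # Forward pass, storing the trajectory (needed by BPTT).
    hs, h = [], np.zeros(n)
    for t in range(T):
        h = np.tanh(W_xh @ xs[t] + W_hh @ h)
        hs.append(h)

    # BPTT: one backward sweep over the unrolled network, O(n^2) per step,
    # but the gradient is only available after the whole sequence.
    grad_bptt = np.zeros_like(W_hh)
    delta = np.zeros(n)                      # dL/dh_t arriving from step t+1
    for t in reversed(range(T)):
        delta = delta + (hs[t] - targets[t])       # add the local error
        da = delta * (1.0 - hs[t] ** 2)            # back through tanh
        h_prev = hs[t - 1] if t > 0 else np.zeros(n)
        grad_bptt += np.outer(da, h_prev)
        delta = W_hh.T @ da                        # pass to step t-1

    # RTRL: carry S[i, j, k] = dh_i/dW_hh[j, k] forwards, O(n^4) per step,
    # but the exact gradient is available on-line at every time step.
    S = np.zeros((n, n, n))
    grad_rtrl = np.zeros_like(W_hh)
    h = np.zeros(n)
    for t in range(T):
        h_new = np.tanh(W_xh @ xs[t] + W_hh @ h)
        d = 1.0 - h_new ** 2                       # tanh'(a_t)
        # da_i/dW[j,k] = delta_ij * h_{t-1,k} + sum_l W[i,l] * S[l,j,k]
        S_new = np.einsum('il,ljk->ijk', W_hh, S)
        for j in range(n):
            S_new[j, j, :] += h
        S = d[:, None, None] * S_new
        h = h_new
        grad_rtrl += np.einsum('i,ijk->jk', h - targets[t], S)

    # Both procedures yield the same exact gradient.
    assert np.allclose(grad_bptt, grad_rtrl)

For n fully connected hidden units, the BPTT backward sweep costs O(n^2) per step while the RTRL sensitivity update costs O(n^4) per step: precisely the low-complexity versus on-line-operation trade-off that the method proposed in the paper aims to combine.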