Abstract

The authors present arguments for a fast online training algorithm for recurrent neural networks. The algorithm requires O(N³) calculations to update the weights in one time step, which is faster than all other known online training algorithms. The derivations are formulated in a variational framework, which has the advantage of providing a unified view of the various algorithms derived by a number of researchers from very different starting points.
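The abstract gives no implementation details, but an O(N³) per-step cost typically indicates that the dominant operation is a dense N×N matrix-matrix product. A minimal sketch of how such a per-step cost arises (the sensitivity-propagation step and all names here are hypothetical illustrations, not the paper's algorithm):

```python
import numpy as np

def per_step_update_cost(N):
    """Multiply-accumulate count for one N x N by N x N matrix
    product, the canonical O(N^3) primitive: N^2 output entries,
    each needing N multiply-adds."""
    return N * N * N

# Illustrative only: an online update whose per-step cost is dominated
# by one dense matrix-matrix product (hypothetical, not the paper's method).
rng = np.random.default_rng(0)
N = 8
W = rng.standard_normal((N, N))   # recurrent weight matrix
P = rng.standard_normal((N, N))   # hypothetical sensitivity matrix
P_next = W @ P                    # the O(N^3) propagation step
```

Doubling N multiplies the per-step cost by eight, which is what distinguishes an O(N³) method from, say, an O(N⁴) one, where doubling N costs a factor of sixteen.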
