Abstract

This is a short overview of the authors' research on the sequential, or recursive, Bayesian estimation of recurrent neural networks. Our approach is founded on the joint estimation of the synaptic weights, neuron outputs, and structure of the recurrent neural network. Joint estimation enables a generalization of the training heuristic known as teacher forcing, which improves training speed, to sequential training on noisy data. By applying a Gaussian mixture approximation to the relevant probability density functions, we have derived training algorithms capable of dealing with non-Gaussian (multimodal or heavy-tailed) noise on training samples. Finally, we have used statistics, recursively updated during sequential Bayesian estimation, to derive criteria for growing and pruning synaptic connections and hidden neurons in recurrent neural networks.
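The joint estimation idea above — tracking the network's hidden neuron outputs and its synaptic weights together in one augmented state vector — can be illustrated with a standard extended Kalman filter. This is a minimal sketch, not the authors' exact algorithm (it uses a single Gaussian rather than a Gaussian mixture, and numerical Jacobians for brevity); the class name `JointEKF`, the toy dimensions, and the noise levels are illustrative assumptions.

```python
import numpy as np

class JointEKF:
    """EKF over the augmented state z = [hidden neuron outputs; weights]."""

    def __init__(self, n_hidden, n_input, q=1e-4, r=1e-2, seed=0):
        self.nh = n_hidden
        self.n_input = n_input
        self.nw = n_hidden * (n_hidden + n_input)   # number of weights
        n = self.nh + self.nw                       # augmented state size
        self.z = np.zeros(n)
        self.z[self.nh:] = 0.1 * np.random.default_rng(seed).standard_normal(self.nw)
        self.P = np.eye(n)                          # joint state covariance
        self.Q = q * np.eye(n)                      # process noise (weight random walk)
        self.R = r * np.eye(n_hidden)               # measurement noise on targets

    def _f(self, z, x):
        """One RNN step on the augmented state; weights follow a random walk."""
        h, w = z[:self.nh], z[self.nh:]
        W = w.reshape(self.nh, self.nh + self.n_input)
        h_new = np.tanh(W @ np.concatenate([h, x]))
        return np.concatenate([h_new, w])

    def step(self, x, y):
        """Predict with input x, then correct with the noisy training target y."""
        n = self.z.size
        # Numerical Jacobian of the transition around the current estimate.
        eps = 1e-6
        f0 = self._f(self.z, x)
        F = np.empty((n, n))
        for i in range(n):
            dz = np.zeros(n); dz[i] = eps
            F[:, i] = (self._f(self.z + dz, x) - f0) / eps
        self.z = f0
        self.P = F @ self.P @ F.T + self.Q
        # Measurement model: the training targets observe the neuron outputs.
        H = np.hstack([np.eye(self.nh), np.zeros((self.nh, self.nw))])
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.z = self.z + K @ (y - self.z[:self.nh])
        self.P = (np.eye(n) - K @ H) @ self.P
        return self.z[:self.nh]
```

Because the correction step feeds the (filtered) target back into the hidden state, this update plays the role of teacher forcing while remaining principled under measurement noise; the recursively updated covariance `P` is the kind of statistic the abstract mentions as a basis for growing and pruning decisions.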

