Abstract

A statistical formulation of recurrent backpropagation (RBP) allows direct noise boosting for time-varying classification and regression. The noise boost reduces the number of training iterations and improves accuracy. The injected noise is precisely the noise that makes the current training signal more probable. This noise-boost result extends two recent results: that backpropagation is a special case of the generalized expectation-maximization (EM) algorithm, and that careful noise injection can always speed the average convergence of the EM algorithm to a local maximum of the log-likelihood surface. The noise-benefit conditions differ for additive and multiplicative noise in RBP. We tested noise-boosted RBP classifiers on 11 classes of sports video clips and RBP regressors on predicting the dollar-rupee exchange rate. Injecting noisy-EM (NEM) noise outperformed both injecting blind noise and injecting no noise at all. Additive NEM noise usually outperformed multiplicative noise. The best case of NEM noise injection in RBP training of a recurrent neural classifier sped up training by 60% and improved classification accuracy by 9.51% compared with noiseless RBP training. The best case of NEM noise in RBP training of a recurrent neural regressor yielded a 38% training speed-up and reduced the squared error by 49.3%. Injecting additive NEM noise into both the output and hidden neurons performed best.

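To make the "noise that makes the current signal more probable" idea concrete, below is a minimal Python/NumPy sketch of additive NEM noise screening for a network with Gaussian output neurons (the regression case). It uses the published NEM hypersphere condition ||n + (t - a)||^2 <= ||t - a||^2, where t is the target vector and a is the network's output activation; noise inside that sphere increases the likelihood of the training signal. The function name, its parameters, and the rejection-sampling loop are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def nem_additive_noise(target, activation, scale=0.1, max_tries=20):
    """Sample additive noise that satisfies the NEM hypersphere
    condition for Gaussian output neurons:
        ||n + (t - a)||^2 <= ||t - a||^2.
    Noise inside this sphere makes the current training signal
    more probable. Falls back to zero noise (no boost) if no
    candidate passes within max_tries."""
    err = target - activation                # residual t - a
    base_norm = np.dot(err, err)             # ||t - a||^2
    for _ in range(max_tries):
        n = np.random.normal(0.0, scale, size=target.shape)
        shifted = n + err                    # n + (t - a)
        if np.dot(shifted, shifted) <= base_norm:
            return n                         # NEM condition holds
    return np.zeros_like(target)             # reject: train noiselessly

# Usage: perturb the training target before the backward pass.
t = np.array([1.0, 0.0, -0.5])               # target signal
a = np.array([0.8, 0.1, -0.3])               # current network output
noisy_target = t + nem_additive_noise(t, a)
```

Rejection sampling is the simplest way to enforce the NEM condition at each training step; the multiplicative-noise condition the abstract mentions takes a different geometric form and would need a different screening test.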