Abstract

We consider a variant of the conventional neural network model, called the stochastic neural network, that can be used to approximate complex nonlinear stochastic systems. We show how the expectation-maximization algorithm can be used to develop efficient estimation schemes that have much lower computational complexity than those for conventional neural networks. This enables us to carry out model selection procedures, such as the Bayesian information criterion, to choose the number of hidden units and the input variables for each hidden unit. Stochastic neural networks are shown to have the universal approximation property of neural networks. Other important properties of the proposed model are given, and model-based multistep-ahead forecasts are provided. We fit stochastic neural network models to several real and simulated time series. Results show that the fitted models improve postsample forecasts over conventional neural networks and other nonlinear and nonparametric models.
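To make the model-selection idea concrete, the sketch below fits networks with different numbers of hidden units to a simulated nonlinear autoregressive series and picks the size with the smallest Bayesian information criterion. This is an illustrative stand-in, not the paper's EM-based procedure for stochastic neural networks: it uses scikit-learn's conventional MLPRegressor for the conditional mean and a Gaussian-likelihood BIC, and the simulated data are an assumption for demonstration only.

```python
# Hedged sketch: BIC-based choice of the number of hidden units on simulated data.
# Not the authors' EM algorithm; a conventional network is used as a stand-in.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Simulate a nonlinear AR(1) series (illustrative data, not from the paper).
n = 500
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.8 * np.tanh(2.0 * y[t - 1]) + 0.3 * rng.standard_normal()

X = y[:-1].reshape(-1, 1)   # lagged value as the single input
target = y[1:]

def gaussian_bic(rss, n_obs, n_params):
    """BIC under i.i.d. Gaussian errors: n*log(RSS/n) + k*log(n)."""
    return n_obs * np.log(rss / n_obs) + n_params * np.log(n_obs)

best = None
for h in range(1, 6):                      # candidate numbers of hidden units
    net = MLPRegressor(hidden_layer_sizes=(h,), activation="logistic",
                       solver="lbfgs", max_iter=5000, random_state=0)
    net.fit(X, target)
    rss = np.sum((target - net.predict(X)) ** 2)
    k = h * (X.shape[1] + 2) + 1           # input weights + hidden biases + output layer
    bic = gaussian_bic(rss, len(target), k)
    if best is None or bic < best[1]:
        best = (h, bic)

print(f"BIC-selected number of hidden units: {best[0]} (BIC = {best[1]:.1f})")
```

In the paper's setting, the lower computational cost of the EM-based estimation is what makes repeating such a fit across candidate architectures practical.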
