Abstract

A Bayesian selective combination method is proposed for combining multiple neural networks in nonlinear dynamic process modelling. Instead of fixed combination weights, the probability of a particular network being the true model is used as that network's combination weight. The prior probability is calculated from the sum of squared errors of the individual networks over a sliding window covering the most recent sampling times. A nearest neighbour method estimates the network error at a given input data point, and this estimate is then used to calculate the combination weights of the individual networks. Forward selection and backward elimination are used to select the networks to be combined. In forward selection, individual networks are added to the aggregated network one at a time until the aggregated network's error on the original training and testing data sets can no longer be reduced. In backward elimination, all individual networks are aggregated initially and are then eliminated one at a time until the aggregated network's error on the original training and testing data sets can no longer be reduced. Application results demonstrate that the proposed techniques significantly improve model generalisation and outperform aggregating all the individual networks.
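The core ideas in the abstract can be illustrated with a short sketch. The snippet below is not the authors' implementation; it assumes Gaussian network errors (so the likelihood of network i given its sliding-window SSE is proportional to exp(-SSE_i / (2σ²))), a uniform prior, and, for simplicity, equal-weight averaging inside the greedy forward-selection loop. The function and variable names (`combination_weights`, `forward_select`, `sigma2`) are illustrative choices, not terms from the paper.

```python
import numpy as np

def combination_weights(window_sse, sigma2=1.0):
    """Bayesian-style combination weights from sliding-window SSEs.

    Assuming Gaussian errors with variance sigma2 and a uniform prior,
    each network's posterior probability of being the true model is
    proportional to exp(-sse_i / (2 * sigma2)); the normalised
    posteriors serve as combination weights.
    """
    sse = np.asarray(window_sse, dtype=float)
    log_w = -sse / (2.0 * sigma2)
    log_w -= log_w.max()              # guard against underflow
    w = np.exp(log_w)
    return w / w.sum()

def forward_select(preds, target):
    """Greedy forward selection of networks to aggregate.

    `preds` has shape (n_networks, n_samples). At each step, add the
    network whose inclusion most reduces the aggregate SSE on the
    reference data; stop when no addition reduces it further.
    """
    preds = np.asarray(preds, dtype=float)
    target = np.asarray(target, dtype=float)
    chosen, best_sse = [], np.inf
    remaining = list(range(preds.shape[0]))
    while remaining:
        trials = [(np.sum((preds[chosen + [j]].mean(axis=0) - target) ** 2), j)
                  for j in remaining]
        sse, j = min(trials)
        if sse >= best_sse:           # no further improvement: stop
            break
        chosen.append(j)
        remaining.remove(j)
        best_sse = sse
    return chosen, best_sse
```

Backward elimination follows the mirror-image logic: start from all networks aggregated and greedily remove the one whose removal most reduces the aggregate error, stopping when no removal helps.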
