Abstract

This paper proposes a generic criterion that defines the optimum number of basis functions for radial basis function (RBF) neural networks. The generalization performance of an RBF network relates to its prediction capability on independent test data, and this performance gives a measure of the quality of the chosen model. An RBF network with an overly restricted basis gives poor predictions on new data, since the model has too little flexibility (yielding high bias and low variance). By contrast, an RBF network with too many basis functions also generalizes poorly, since it is too flexible and fits too much of the noise in the training data (yielding low bias but high variance). Bias and variance are complementary quantities, and the number of basis functions must be chosen to achieve the best compromise between them. In this paper we use Stein's unbiased risk estimator (SURE) to derive an analytical criterion for assigning the appropriate number of basis functions. Both the known and unknown noise cases are considered, and the efficacy of the criterion in each situation is demonstrated experimentally. The paper also presents an empirical comparison between this method and two well-known classical methods: cross-validation and the Bayesian information criterion (BIC).
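As an illustrative sketch of the idea, and not the paper's own derivation: an RBF network that is linear in its output weights is a linear smoother, y_hat = S y with S = Phi (Phi^T Phi)^{-1} Phi^T, and with known noise variance sigma^2 Stein's unbiased risk estimate reduces to SURE(M) = ||y - y_hat||^2 + 2 sigma^2 tr(S) - n sigma^2, where tr(S) equals the number of basis functions M when Phi has full column rank. The Python code below selects M by minimizing this quantity on synthetic data; the Gaussian basis width, evenly spaced centers, and test function are assumptions made for illustration only.

import numpy as np

rng = np.random.default_rng(0)

def rbf_design(x, centers, width):
    """Gaussian RBF design matrix: Phi[i, j] = exp(-(x_i - c_j)^2 / (2 width^2))."""
    return np.exp(-((x[:, None] - centers[None, :]) ** 2) / (2.0 * width ** 2))

# Synthetic 1-D regression problem with known noise level sigma (illustrative).
n, sigma = 200, 0.2
x = np.sort(rng.uniform(0.0, 1.0, n))
y = np.sin(2.0 * np.pi * x) + sigma * rng.normal(size=n)

best_M, best_sure = None, np.inf
for M in range(1, 31):
    centers = np.linspace(0.0, 1.0, M)          # evenly spaced centers (assumption)
    Phi = rbf_design(x, centers, width=0.1)     # fixed basis width (assumption)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None) # least-squares output weights
    rss = np.sum((y - Phi @ w) ** 2)            # residual sum of squares
    # SURE for a linear smoother; tr(S) = M when Phi has full column rank.
    sure = rss + 2.0 * sigma ** 2 * M - n * sigma ** 2
    if sure < best_sure:
        best_M, best_sure = M, sure

print(f"SURE-selected number of basis functions: {best_M}")

In the unknown-noise case treated in the paper, sigma^2 is not available and must itself be estimated (for example, from the residuals of a deliberately flexible fit) before a criterion of this form can be evaluated.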
