Abstract

Traditional neural network training is usually based on maximum likelihood estimation to obtain appropriate network parameters (weights and biases) given the training data. However, when the available data are finite and noisy, maximum likelihood training can cause the trained network to overfit the noise. This problem has been overcome in various applications by applying Bayesian inference to neural network training. Bayesian inference allows the values of regularization parameters to be determined using only the training data. In addition, the Bayesian approach allows different models (e.g., neural networks with different numbers of hidden units) to be compared using only the training data. Neural networks trained with Bayesian inference are also known as Bayesian neural networks (BNNs). This chapter focuses on BNNs for classification problems, with the model complexity of BNNs handled conveniently by a method known as the evidence framework.
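To make the idea concrete, the sketch below illustrates the evidence framework in its simplest setting: a Laplace approximation around the MAP weights of a logistic-regression "network" for binary classification, with MacKay-style re-estimation of the weight-decay hyperparameter alpha from the training data alone. The function names, the toy data, and the choice of logistic regression (rather than a multilayer network) are illustrative assumptions, not details taken from the chapter.

```python
# Minimal sketch of the evidence framework (assumed setup, not the chapter's code):
# alternate MAP estimation of the weights with re-estimation of the prior
# precision alpha, and score the model by the Laplace-approximated log evidence.
import numpy as np

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

def fit_map(X, t, alpha, n_iter=25):
    """MAP weights for logistic regression with Gaussian prior N(0, alpha^-1 I),
    found by Newton's method; returns the weights and the Hessian at the mode."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(n_iter):
        y = sigmoid(X @ w)
        grad = X.T @ (y - t) + alpha * w                 # gradient of neg. log posterior
        R = y * (1.0 - y)
        H = X.T @ (X * R[:, None]) + alpha * np.eye(d)   # Hessian of neg. log posterior
        w -= np.linalg.solve(H, grad)
    return w, H

def log_evidence(X, t, w, H, alpha):
    """Laplace approximation to ln p(D | alpha) at the MAP weights."""
    y = np.clip(sigmoid(X @ w), 1e-12, 1 - 1e-12)
    log_lik = np.sum(t * np.log(y) + (1 - t) * np.log(1 - y))
    d = len(w)
    return (log_lik
            - 0.5 * alpha * (w @ w)
            + 0.5 * d * np.log(alpha)
            - 0.5 * np.linalg.slogdet(H)[1])

def evidence_framework(X, t, alpha=1.0, n_outer=10):
    """Re-estimate alpha via gamma / ||w||^2, where gamma counts the
    effective number of well-determined parameters."""
    d = X.shape[1]
    for _ in range(n_outer):
        w, H = fit_map(X, t, alpha)
        eigvals = np.linalg.eigvalsh(H - alpha * np.eye(d))
        gamma = np.sum(eigvals / (eigvals + alpha))
        alpha = gamma / (w @ w)
    return w, alpha, log_evidence(X, t, w, H, alpha)

# Toy usage: two Gaussian blobs with a bias column appended to the inputs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
X = np.hstack([X, np.ones((100, 1))])
t = np.concatenate([np.zeros(50), np.ones(50)])
w, alpha, log_ev = evidence_framework(X, t)
print("alpha =", alpha, "log evidence =", log_ev)
```

Because the log evidence is computed from the training data only, it can be compared across candidate models (for a BNN, networks with different numbers of hidden units) without a separate validation set, which is the role the evidence framework plays in the chapter.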
