Abstract
We consider the probabilistic neural network (PNN) that is a mixture of Gaussian basis functions with different variances. Such a Gaussian heteroscedastic PNN is more economical, in terms of the number of kernel functions required, than the Gaussian mixture PNN with a common variance. The expectation-maximisation (EM) algorithm, although a powerful technique for constructing maximum likelihood (ML) homoscedastic PNNs, often encounters numerical difficulties when training heteroscedastic PNNs. We combine a robust statistical technique known as the jackknife with the EM algorithm to provide a robust ML training algorithm. An artificial-data case, the two-dimensional XOR problem, and a real-data case, success-or-failure prediction of UK private construction companies, are used to evaluate the performance of this robust learning algorithm.
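To make the setting concrete, the sketch below fits a heteroscedastic Gaussian mixture (spherical Gaussian kernels, each with its own variance) by plain EM on synthetic two-dimensional XOR-style data. This is a minimal illustration of the ML estimation problem the abstract describes, not the paper's method: the jackknife robustification is not reproduced, and a simple variance floor is used here as an assumed stand-in guard against the numerical collapse of individual kernel variances that the abstract alludes to.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 2-D XOR-style data: four Gaussian clusters at the unit-square corners.
centers = np.array([[0, 0], [1, 1], [1, 0], [0, 1]], dtype=float)
X = np.vstack([c + 0.1 * rng.standard_normal((50, 2)) for c in centers])

def em_heteroscedastic(X, k=4, iters=50, var_floor=1e-4):
    """EM for a mixture of spherical Gaussians, each with its own variance.

    var_floor is an assumed safeguard (not from the paper) that prevents a
    component's variance from shrinking to zero on a single data point.
    """
    n, d = X.shape
    mu = X[rng.choice(n, k, replace=False)]   # initialise means at random points
    var = np.full(k, X.var())                 # per-component variances
    w = np.full(k, 1.0 / k)                   # mixing weights
    ll_hist = []
    for _ in range(iters):
        # E-step: responsibilities via log-sum-exp for numerical stability.
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)          # (n, k)
        logp = np.log(w) - 0.5 * d * np.log(2 * np.pi * var) - d2 / (2 * var)
        m = logp.max(axis=1, keepdims=True)
        lse = m[:, 0] + np.log(np.exp(logp - m).sum(axis=1))
        r = np.exp(logp - lse[:, None])                               # responsibilities
        ll_hist.append(lse.sum())                                     # log-likelihood
        # M-step: closed-form updates for weights, means, per-component variances.
        nk = r.sum(axis=0)
        mu = (r.T @ X) / nk[:, None]
        d2 = ((X[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
        var = np.maximum((r * d2).sum(axis=0) / (d * nk), var_floor)
        w = nk / n
    return w, mu, var, ll_hist

w, mu, var, ll = em_heteroscedastic(X)
```

Because each kernel carries its own variance, four components suffice to model the XOR clusters; a common-variance (homoscedastic) mixture would typically need more kernels when cluster spreads differ. EM guarantees a non-decreasing log-likelihood, which `ll_hist` lets you verify.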