Abstract
A new approach to the estimation of 'a posteriori' class probabilities using neural networks, the Joint Network and Data Density Estimation (JNDDE), is presented. It is based on the estimation of the conditional data density functions, with some restrictions imposed by the classifier structure; Bayes' rule is then used to obtain the 'a posteriori' probabilities from these densities. The proposed method is applied to three different network structures: the logistic perceptron (for the binary case), the softmax perceptron (for multi-class problems) and a generalized softmax perceptron (which can map arbitrarily complex probability functions). Gaussian mixture models are used for the conditional densities. The method has the advantage of establishing a distinction between the network architecture constraints and the model of the data, separating the network parameters from the model parameters, so that the complexity of either can be fixed as desired. Maximum likelihood gradient-based rules for the estimation of the parameters can be obtained. It is shown that JNDDE exhibits more robust convergence characteristics than other methods of a posteriori probability estimation, such as those based on the minimization of a Strict Sense Bayesian cost function.
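The core idea summarized above (class-conditional densities modeled by Gaussian mixtures, with posteriors recovered through Bayes' rule) can be sketched as follows. This is a minimal illustrative example, not the paper's implementation: the mixture parameters and priors below are hypothetical, and the densities are one-dimensional for clarity.

```python
import math

def gmm_density(x, weights, means, stds):
    """Evaluate a 1-D Gaussian mixture density p(x | class)."""
    return sum(
        w * math.exp(-0.5 * ((x - m) / s) ** 2) / (s * math.sqrt(2 * math.pi))
        for w, m, s in zip(weights, means, stds)
    )

def posteriors(x, class_models, priors):
    """Bayes' rule: P(c | x) = p(x | c) P(c) / sum_k p(x | k) P(k)."""
    joint = [gmm_density(x, *model) * prior
             for model, prior in zip(class_models, priors)]
    total = sum(joint)
    return [j / total for j in joint]

# Hypothetical two-class problem: each class-conditional density is a
# two-component Gaussian mixture given as (weights, means, std devs).
models = [
    ([0.6, 0.4], [-1.0, 0.0], [0.5, 1.0]),   # class 0
    ([0.5, 0.5], [1.0, 2.0], [0.5, 0.8]),    # class 1
]
p = posteriors(0.5, models, priors=[0.5, 0.5])
print(p)  # the two posteriors sum to 1
```

In JNDDE the mixture parameters would additionally be constrained by the classifier structure and fitted by maximum-likelihood gradient rules; here they are simply fixed by hand to show the Bayes-rule step.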