Abstract
The paper introduces a robust connectionist technique for the empirical nonparametric estimation of multivariate probability density functions (pdfs) from unlabeled data samples (still an open issue in pattern recognition and machine learning). To this end, a soft-constrained unsupervised algorithm for training a multilayer perceptron (MLP) is proposed. A variant of the Metropolis–Hastings algorithm (exploiting the inherently probabilistic nature of the present MLP) is used to guarantee a model that numerically satisfies Kolmogorov's second axiom of probability. The approach overcomes the major limitations of established statistical and connectionist pdf estimators. Graphical and quantitative experimental results show that the proposed technique yields estimates that improve significantly over parametric and nonparametric approaches, regardless of (1) the complexity of the underlying pdf, (2) the dimensionality of the feature space, and (3) the amount of data available for training.
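To illustrate the Metropolis–Hastings component mentioned above, the following is a minimal sketch of random-walk Metropolis–Hastings sampling from an unnormalized density. The key property exploited is that the acceptance ratio depends only on density ratios, so the unknown normalizing constant cancels; here a simple unnormalized Gaussian stands in for the MLP's raw density output, which is an assumption for illustration, not the paper's actual model.

```python
import math
import random


def unnormalized_pdf(x):
    # Stand-in for the network's raw (unnormalized) density output;
    # an unnormalized standard Gaussian is assumed here for illustration.
    return math.exp(-0.5 * x * x)


def metropolis_hastings(target, n_samples, x0=0.0, step=1.0,
                        burn_in=1000, seed=0):
    """Draw samples from `target` (known only up to a constant)."""
    rng = random.Random(seed)
    x = x0
    samples = []
    for i in range(n_samples + burn_in):
        proposal = x + rng.gauss(0.0, step)  # symmetric random-walk proposal
        # Accept with probability min(1, target(proposal) / target(x));
        # the normalizing constant cancels in this ratio.
        if rng.random() < min(1.0, target(proposal) / target(x)):
            x = proposal
        if i >= burn_in:
            samples.append(x)
    return samples


samples = metropolis_hastings(unnormalized_pdf, 20000)
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
```

With enough samples, the empirical mean and variance approach those of the standard Gaussian target (0 and 1), which is how such a sampler can be used to probe, numerically, whether a learned density behaves consistently with the normalization axiom.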