Abstract

A novel unsupervised Bayesian learning framework based on the asymmetric Gaussian mixture (AGM) statistical model is proposed, since the AGM has been shown to be more effective than the classic Gaussian mixture model. The Bayesian learning framework is developed by adopting a sampling-based Markov chain Monte Carlo (MCMC) methodology. More precisely, the fundamental learning algorithm is a hybrid Metropolis–Hastings within Gibbs sampling solution, integrated into a reversible jump MCMC learning framework: a self-adapting, sampling-based implementation that enables transitions between models of different sizes during the learning of the mixture parameters and therefore converges automatically to the optimal number of data groups. Furthermore, in order to handle high-dimensional feature vectors, a dimensionality reduction algorithm based on mixtures of distributions is included to discard irrelevant and extraneous features. The performance of the AGM is compared with that of other popular models, and both synthetic and real datasets drawn from challenging applications such as intrusion detection, spam filtering, and image categorization are evaluated to demonstrate the merits of the proposed approach.
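To make the sampling scheme concrete, the following is a minimal sketch of a Metropolis–Hastings within Gibbs sweep for an asymmetric Gaussian mixture with a fixed number of components. The two-sided (left/right standard deviation) parameterization of the asymmetric Gaussian, the flat priors, the random-walk proposal, the empirical update of the mixing weights, and the omission of reversible jump moves and feature selection are all simplifying assumptions made here for illustration; they are not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def agm_pdf(x, mu, sl, sr):
    """Asymmetric Gaussian density with left/right std devs sl, sr (assumed form)."""
    norm = 2.0 / (np.sqrt(2.0 * np.pi) * (sl + sr))
    s = np.where(x < mu, sl, sr)
    return norm * np.exp(-0.5 * ((x - mu) / s) ** 2)

def gibbs_assignments(x, weights, mus, sls, srs):
    """Gibbs step: sample component labels from their full conditionals."""
    K = len(weights)
    resp = np.stack([weights[k] * agm_pdf(x, mus[k], sls[k], srs[k])
                     for k in range(K)], axis=1)
    resp /= resp.sum(axis=1, keepdims=True)
    return np.array([rng.choice(K, p=p) for p in resp])

def mh_update_component(xk, mu, sl, sr, step=0.1):
    """Metropolis-Hastings step for one component's parameters
    (random-walk proposal; flat priors assumed for brevity)."""
    prop = np.array([mu, sl, sr]) + step * rng.standard_normal(3)
    if prop[1] <= 0 or prop[2] <= 0:          # reject invalid scales
        return mu, sl, sr
    log_acc = (np.log(agm_pdf(xk, *prop)).sum()
               - np.log(agm_pdf(xk, mu, sl, sr)).sum())
    if np.log(rng.uniform()) < log_acc:
        return tuple(prop)
    return mu, sl, sr

# Tiny demo: two clusters, a few MH-within-Gibbs sweeps.
x = np.concatenate([rng.normal(-2, 0.5, 200), rng.normal(3, 1.5, 200)])
K = 2
weights = np.full(K, 1.0 / K)
mus, sls, srs = np.array([-1.0, 1.0]), np.ones(K), np.ones(K)
for sweep in range(200):
    z = gibbs_assignments(x, weights, mus, sls, srs)
    for k in range(K):
        xk = x[z == k]
        if xk.size:
            mus[k], sls[k], srs[k] = mh_update_component(xk, mus[k], sls[k], srs[k])
            weights[k] = xk.size / x.size      # empirical weight update (simplification)
print("estimated means:", np.round(mus, 2))
```

In the full framework described above, additional reversible jump moves (component birth, death, split, and merge) would be proposed alongside these sweeps so that the number of components K is itself inferred rather than fixed.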
