Abstract
Within a Bayesian framework, a comprehensive investigation of mixtures of finite mixtures (MFMs), i.e., finite mixtures with a prior on the number of components, is performed. This model class has applications in model-based clustering as well as for semi-parametric density estimation and requires suitable prior specifications and inference methods to exploit its full potential. We contribute by considering a generalized class of MFMs where the hyperparameter γK of a symmetric Dirichlet prior on the weight distribution depends on the number of components. We show that this model class may be regarded as a Bayesian non-parametric mixture outside the class of Gibbs-type priors. We emphasize the distinction between the number of components K of a mixture and the number of clusters K+, i.e., the number of filled components given the data. In the MFM model, K+ is a random variable and its prior depends on the prior on K and on the hyperparameter γK. We employ a flexible prior distribution for the number of components K and derive the corresponding prior on the number of clusters K+ for generalized MFMs. For posterior inference we propose the novel telescoping sampler which allows Bayesian inference for mixtures with arbitrary component distributions without resorting to reversible jump Markov chain Monte Carlo (MCMC) methods. The telescoping sampler explicitly samples the number of components, but otherwise requires only the usual MCMC steps of a finite mixture model. The ease of its application using different component distributions is demonstrated on several data sets.
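The dependence of the prior on the number of clusters K+ on the prior on K and on the hyperparameter γK described above can be sketched, by the law of total probability, as follows (the explicit form of the conditional term for generalized MFMs is derived in the paper):

```latex
p(K_{+} = k) \;=\; \sum_{K = k}^{\infty} p(K)\, \Pr(K_{+} = k \mid K, \gamma_K),
\qquad k = 1, 2, \dots
```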
Highlights
The present paper contributes to Bayesian mixture analysis where the number of components K is unknown and a prior on K is specified
The class of generalized mixtures of finite mixtures (MFMs) we introduce in this paper is a finite mixture model with a prior on K, where the hyperparameter γK of the symmetric Dirichlet prior DK(γK) on the component weights may change as a function of K
We focus on the dynamic MFM where γK = α/K is inversely proportional to the number of components K and show that it converges to a Dirichlet process mixture (DPM) with concentration parameter α if the prior p(K) puts all mass on +∞
Summary
The present paper contributes to Bayesian mixture analysis where the number of components K is unknown and a prior on K is specified. The dynamic MFM uses γK = α/K and can induce a dynamic sparse finite mixture (SFM) with a prior on K. This MFM specification, considered previously in McCullagh and Yang (2008), is less common in applied finite mixture analysis than the static MFM. Exploiting that static MFMs are Gibbs-type priors, Miller and Harrison (2018) introduced sampling techniques from Bayesian non-parametric (BNP) statistics to finite mixture analysis. In the telescoping sampler, sampling K depends only on the current partition of the data and is independent of the component parameters. This makes our sampler a most generic inference tool for finite mixture models with an unknown number of components, applicable to arbitrary mixture families.
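The distinction between the number of components K and the number of clusters (filled components) K+ under the dynamic MFM specification γK = α/K can be illustrated by forward simulation from the prior. The following sketch is illustrative only, not the paper's telescoping sampler; the shifted-Poisson prior on K and the parameter values are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_K_plus(n, alpha=1.0):
    """Draw the number of clusters K+ induced by a dynamic MFM prior.

    Illustrative steps:
      1. K ~ p(K)                       (here: an assumed shifted Poisson prior)
      2. eta ~ Dirichlet_K(alpha / K)   (dynamic MFM: gamma_K = alpha / K)
      3. allocate n observations to the K components with probabilities eta
      4. K+ = number of components holding at least one observation
    """
    K = 1 + rng.poisson(3)                       # prior on K (assumption)
    eta = rng.dirichlet(np.full(K, alpha / K))   # symmetric Dirichlet weights
    counts = rng.multinomial(n, eta)             # component allocations
    return np.count_nonzero(counts)              # filled components = K+

draws = [sample_K_plus(n=100, alpha=1.0) for _ in range(2000)]
print("E[K+] ≈", np.mean(draws))
```

For small α the Dirichlet weights are sparse, so K+ typically falls below K: many components remain empty even though they are part of the model.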