Abstract

In this paper, a Bayesian paradigm for mixture models with finite and infinite numbers of components is expounded under a generic prior and likelihood, so that the random noise may follow any distributional form. The mixture model is characterized by three stylized properties: proportional allocation (the mixing weights), sample-size allocation, and a latent (unobserved) allocation variable, which together yield a common probabilistic generalization. The Expectation-Maximization (EM) algorithm was adopted to estimate these stylized parameters. Markov Chain Monte Carlo (MCMC) methods, in particular the Metropolis–Hastings sampler, were adopted as alternatives to the EM algorithm when it is not analytically feasible, that is, when the unobserved variable cannot be replaced by its conditional expectation (mean); the Metropolis–Hastings acceptance ratio then corrects the exploration of the posterior distribution. To handle label switching arising from the exchangeability of the posterior distribution, a truncated or alternating prior form was imposed on the posterior, allowing robust inference through the Maximum a Posteriori (MAP) index. In conclusion, the simulation study showed that as the number of components grows, the number of label permutations becomes too large for all of them to be considered, so permutations of subsamples must be used instead.
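
To make the E-step concrete, the sketch below runs EM on a one-dimensional, two-component Gaussian mixture, with the responsibilities standing in for the latent allocation variable via its conditional expectation. The Gaussian choice, the function name em_gaussian_mixture, and all parameter values are illustrative assumptions; the paper treats the prior and likelihood generically.

```python
import numpy as np

def em_gaussian_mixture(x, n_components=2, n_iter=200, seed=0):
    """EM for a one-dimensional Gaussian mixture (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    # Initialise proportional allocation (weights), means, and variances.
    pi = np.full(n_components, 1.0 / n_components)
    mu = rng.choice(x, size=n_components, replace=False)
    var = np.full(n_components, np.var(x))
    for _ in range(n_iter):
        # E-step: responsibilities replace the latent allocation variable
        # with its conditional expectation given the current parameters.
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        resp = pi * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: update weights, means, and variances from the responsibilities.
        nk = resp.sum(axis=0)
        pi = nk / n
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return pi, mu, var

# Example usage on simulated two-component data.
rng = np.random.default_rng(1)
data = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.0, 700)])
print(em_gaussian_mixture(data))
```

On data simulated from two well-separated components, as above, the recovered proportions and means approach the simulating values.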
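Where the E-step expectation is unavailable, the abstract falls back on MCMC. The following is a minimal sketch of a random-walk Metropolis–Hastings sampler over the component means of the same Gaussian mixture, assuming (for simplicity, not from the paper) fixed weights and variances and a flat prior, so the posterior is proportional to the likelihood; the function names and step size are likewise assumptions.

```python
import numpy as np

def log_mixture_lik(x, pi, mu, var):
    """Log-likelihood of a one-dimensional Gaussian mixture."""
    dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    return np.log(dens @ pi).sum()

def mh_component_means(x, pi, var, n_draws=5000, step=0.2, seed=0):
    """Random-walk Metropolis-Hastings over the component means only."""
    rng = np.random.default_rng(seed)
    mu = np.quantile(x, np.linspace(0.1, 0.9, len(pi)))  # crude initialisation
    logp = log_mixture_lik(x, pi, mu, var)  # flat prior: posterior ∝ likelihood
    draws = np.empty((n_draws, len(mu)))
    for t in range(n_draws):
        prop = mu + rng.normal(0.0, step, size=mu.shape)  # symmetric proposal
        logp_prop = log_mixture_lik(x, pi, prop, var)
        # Acceptance ratio corrects the exploration of the posterior:
        # accept with probability min(1, posterior(prop) / posterior(mu)).
        if np.log(rng.uniform()) < logp_prop - logp:
            mu, logp = prop, logp_prop
        draws[t] = mu
    return draws
```

Because the mixture posterior is exchangeable in the component labels, a long enough chain may swap the means mid-run; this is precisely the label-switching problem the abstract addresses by imposing a truncated or alternating prior form and reading inference off the Maximum a Posteriori (MAP) index.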
