Abstract
In this paper, a Bayesian paradigm for mixture models with finite and infinite numbers of components is expounded for a generic prior and likelihood that may follow any distributional random noise. The mixture model comprises stylized properties: proportional allocation, sample-size allocation, and a latent (unobserved) variable for analogous probabilistic generalization. The Expectation-Maximization (EM) algorithm was adopted to estimate these stylized parameters. The Markov Chain Monte Carlo (MCMC) and Metropolis–Hastings sampler algorithms were adopted as alternatives to the EM algorithm when it is not analytically feasible, that is, when the unobserved variable cannot be replaced by imposed expectations (means) and when exploration of the posterior distribution must be corrected by means of an acceptance-ratio quantity, respectively. Label switching, for exchangeability of the posterior distribution via a truncated or alternating prior distributional form, was imposed on the posterior distribution to obtain robust, tailored inference through the Maximum a Posteriori (MAP) index. In conclusion, a simulation study showed that the number of components grows large when all permutations are considered for subsample permutations.
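The EM scheme summarized above, in which the latent allocation variable is replaced by its imposed expectation (the E-step) before the mixture parameters are re-estimated (the M-step), can be sketched for the simplest finite case. The following is a minimal illustrative sketch for a two-component univariate Gaussian mixture, not the paper's implementation; the data, initialization, and iteration count are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed simulated data: mixture of N(0, 1) and N(5, 1), weights 0.4 / 0.6.
x = np.concatenate([rng.normal(0.0, 1.0, 400), rng.normal(5.0, 1.0, 600)])

# Initial guesses for the mixing proportions, means, and variances.
w = np.array([0.5, 0.5])
mu = np.array([x.min(), x.max()])
var = np.array([1.0, 1.0])

def normal_pdf(x, mu, var):
    return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)

for _ in range(100):
    # E-step: posterior responsibility of each component for each point --
    # the expectation that stands in for the unobserved allocation variable.
    dens = w * normal_pdf(x[:, None], mu, var)        # shape (n, 2)
    resp = dens / dens.sum(axis=1, keepdims=True)

    # M-step: re-estimate proportions, means, and variances
    # from the responsibilities.
    nk = resp.sum(axis=0)
    w = nk / len(x)
    mu = (resp * x[:, None]).sum(axis=0) / nk
    var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk

print(np.sort(mu), w)  # estimated component means and proportions
```

When the E-step expectations are not analytically available, the abstract's alternative applies: the allocations are sampled rather than averaged, and a Metropolis–Hastings acceptance ratio corrects the exploration of the posterior.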