Abstract

We propose efficient Markov chain Monte Carlo samplers for posterior simulation in conjugate Bayesian nonparametric models. The Gibbs sampler is easily implemented but updates cluster membership only one item at a time and thus can be hampered by poor mixing. As such, the literature suggests augmenting Gibbs scans with merge-split proposals, which can perform dramatic updates that move between posterior modes. In contrast to existing, less efficient merge-split samplers, we introduce a class of merge-split samplers whose updates are both computationally efficient and well supported by the posterior distribution. A key feature of our samplers is that they propose updates through the sequential allocation of items. To investigate the efficiency of existing samplers and of our novel samplers, we simulate multiple replications of six different dataset/model scenarios and show that our sequentially allocated merge-split samplers achieve the highest relative computational efficiency.
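To make the idea of sequential allocation concrete, the following is a minimal sketch of how a split proposal might be built one item at a time. It is an illustration only, not the paper's exact sampler: the `log_pred` scoring function below is a hypothetical stand-in for the conjugate predictive density, and the toy data are invented. Each remaining item is assigned to one of two sub-clusters (seeded by two anchor items) with probability proportional to its predictive fit, while the log proposal probability needed for the Metropolis-Hastings acceptance ratio is accumulated.

```python
import math
import random


def sequential_split(items, anchor_i, anchor_j, log_pred, rng):
    """Propose a split of `items` into two sub-clusters by sequential allocation.

    `log_pred(x, cluster)` is a hypothetical log predictive score of item `x`
    under `cluster`; in a conjugate model it would be the closed-form
    posterior predictive density. Returns the two sub-clusters and the log
    probability of having proposed this particular split.
    """
    S_i, S_j = [anchor_i], [anchor_j]
    log_q = 0.0  # accumulated log proposal probability of this split
    for x in items:
        if x == anchor_i or x == anchor_j:
            continue  # anchors are fixed in their own sub-clusters
        li = log_pred(x, S_i)
        lj = log_pred(x, S_j)
        m = max(li, lj)  # stabilize the softmax numerically
        p_i = math.exp(li - m) / (math.exp(li - m) + math.exp(lj - m))
        if rng.random() < p_i:
            S_i.append(x)
            log_q += math.log(p_i)
        else:
            S_j.append(x)
            log_q += math.log(1.0 - p_i)
    return S_i, S_j, log_q


# Toy predictive score (an assumption for illustration): closeness of x to
# the current sub-cluster mean, on the log scale.
def toy_log_pred(x, cluster):
    mean = sum(cluster) / len(cluster)
    return -((x - mean) ** 2)


items = [0.1, 0.2, 0.15, 5.0, 5.1]
S_i, S_j, log_q = sequential_split(items, 0.1, 5.0, toy_log_pred, random.Random(0))
```

In a full merge-split sampler, `log_q` would enter the acceptance ratio alongside the prior and likelihood terms; the merge move is deterministic, so only the split direction requires this sequential-allocation probability.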
