Abstract

We present a sequential sampling methodology for weakly structural Markov laws, arising naturally in a Bayesian structure learning context for decomposable graphical models. As a key component of our suggested approach, we show that the problem of graph estimation, which in general lacks a natural sequential interpretation, can be recast into a sequential setting by proposing a recursive Feynman-Kac model that generates a flow of junction tree distributions over a space of increasing dimensions. We focus on particle McMC methods to provide samples on this space, in particular on particle Gibbs (PG), as it allows for generating McMC chains with global moves on an underlying space of decomposable graphs. To further improve the PG mixing properties, we incorporate a systematic refreshment step implemented through direct sampling from a backward kernel. The theoretical properties of the algorithm are investigated, showing that the proposed refreshment step improves the performance in terms of asymptotic variance of the estimated distribution. The suggested sampling methodology is illustrated through a collection of numerical examples demonstrating high accuracy in Bayesian graph structure learning in both discrete and continuous graphical models.
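To make the particle Gibbs pattern concrete, the sketch below is a purely illustrative toy, not the algorithm of the paper: it shows conditional SMC with a backward-sampling pass standing in for the systematic refreshment step, applied to a hypothetical one-dimensional Gaussian state-space model rather than to junction trees. All model parameters, kernels and names are assumptions made for illustration.

```python
# Hypothetical sketch: particle Gibbs with a backward-sampling "refreshment" pass
# on a toy 1-D Gaussian state-space model (an assumption, not the paper's target).
import numpy as np

rng = np.random.default_rng(0)

# Toy model: x_t = 0.9 x_{t-1} + N(0, Q),  y_t = x_t + N(0, R)
A, Q, R, T, N = 0.9, 1.0, 0.25, 50, 100
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = A * x_true[t - 1] + rng.normal(0.0, np.sqrt(Q))
y = x_true + rng.normal(0.0, np.sqrt(R), size=T)

def log_g(y_t, x):
    """Log observation density N(y_t; x, R), up to an additive constant."""
    return -0.5 * (y_t - x) ** 2 / R

def csmc_refresh(y, x_ref):
    """One conditional SMC sweep followed by a backward-sampling pass.

    The backward pass plays the role of the refreshment step: the new reference
    trajectory is drawn from a backward kernel instead of being read off a
    surviving ancestral line, which typically improves mixing.
    """
    X = np.zeros((T, N))
    logW = np.zeros((T, N))

    X[0] = rng.normal(0.0, np.sqrt(Q), size=N)
    X[0, -1] = x_ref[0]                        # particle N-1 carries the reference path
    logW[0] = log_g(y[0], X[0])
    for t in range(1, T):
        w = np.exp(logW[t - 1] - logW[t - 1].max())
        w /= w.sum()
        anc = rng.choice(N, size=N, p=w)       # multinomial resampling
        X[t] = A * X[t - 1, anc] + rng.normal(0.0, np.sqrt(Q), size=N)
        X[t, -1] = x_ref[t]                    # keep the reference particle frozen
        logW[t] = log_g(y[t], X[t])

    # Backward-sampling ("refreshment") pass: draw a fresh trajectory backwards in time.
    x_new = np.zeros(T)
    w = np.exp(logW[-1] - logW[-1].max()); w /= w.sum()
    x_new[-1] = X[-1, rng.choice(N, p=w)]
    for t in range(T - 2, -1, -1):
        logb = logW[t] - 0.5 * (x_new[t + 1] - A * X[t]) ** 2 / Q
        b = np.exp(logb - logb.max()); b /= b.sum()
        x_new[t] = X[t, rng.choice(N, p=b)]
    return x_new

# Particle Gibbs: alternate conditional SMC sweeps, each refreshing the reference path.
x_ref, draws = np.zeros(T), []
for it in range(200):
    x_ref = csmc_refresh(y, x_ref)
    if it >= 50:                               # discard a short burn-in
        draws.append(x_ref.copy())
print("posterior mean of the final state:", np.mean([d[-1] for d in draws]))
```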

Highlights

  • Understanding the underlying dependence structure of a multivariate distribution is becoming increasingly important in modern applications when analysing complex data

  • A specific family of undirected graphical models extensively studied in the literature comprises those which are Markov with respect to decomposable graphs, usually referred to as decomposable graphical models (DGMs), to which we restrict our attention in the present paper

  • We present a procedure for recasting the problem of structure learning in weakly structural Markov (WSM) laws, which in general lacks a natural sequential interpretation, into a sequential setting by an auxiliary construction that we refer to as a temporal embedding, relying partly on the methodology of sequential Monte Carlo (SMC) samplers; see Del Moral, Doucet and Jasra (2006)


Summary

Introduction

Understanding the underlying dependence structure of a multivariate distribution is becoming increasingly important in modern applications when analysing complex data. The common strategy of Bayesian structure learning is based on the class of Markov chain Monte Carlo (McMC) methods, such as the Metropolis-Hastings sampling scheme. These methods generate Markov chains by performing local perturbations on the edge set, operating either directly on the space of decomposable graphs or on their corresponding junction trees. The main issue for the above-mentioned samplers, as well as for other McMC strategies based on local moves, is the limited mobility of their corresponding Markov chains, since at each step only a small part of the edge set is altered. To tackle this issue, we present a procedure for recasting the problem of structure learning in weakly structural Markov (WSM) laws, which in general lacks a natural sequential interpretation, into a sequential setting by an auxiliary construction that we refer to as a temporal embedding, relying partly on the methodology of sequential Monte Carlo (SMC) samplers; see Del Moral, Doucet and Jasra (2006). Appendix A contains some graph-theoretical notation, proofs and a lemma.
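For readers unfamiliar with SMC samplers, the following minimal sketch illustrates the generic pattern of Del Moral, Doucet and Jasra (2006): particles are moved through a sequence of bridging distributions via reweighting, adaptive resampling and MCMC moves. It is an assumption-laden toy on a one-dimensional Gaussian target, not the temporal embedding of the paper; all densities and tuning choices here are illustrative assumptions.

```python
# Hedged sketch of a generic SMC sampler (tempering on a toy 1-D target).
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    """Final target density, up to a constant: a standard normal (an assumption)."""
    return -0.5 * x ** 2

def log_p0(x):
    """Initial, diffuse density N(0, 3^2), up to a constant."""
    return -0.5 * x ** 2 / 9.0

def log_pi(x, beta):
    """Bridging density: geometric path between the initial law and the target."""
    return (1.0 - beta) * log_p0(x) + beta * log_target(x)

N, n_steps = 500, 20
betas = np.linspace(0.0, 1.0, n_steps + 1)     # annealing schedule
x = rng.normal(0.0, 3.0, size=N)               # particles drawn from the initial law
logw = np.zeros(N)

for k in range(1, n_steps + 1):
    # Incremental importance weight: ratio of consecutive bridging densities.
    logw += log_pi(x, betas[k]) - log_pi(x, betas[k - 1])
    w = np.exp(logw - logw.max()); w /= w.sum()
    if 1.0 / np.sum(w ** 2) < N / 2:           # resample on low effective sample size
        x = x[rng.choice(N, size=N, p=w)]
        logw = np.zeros(N)
    # Random-walk Metropolis move leaving the current bridging distribution invariant.
    prop = x + rng.normal(0.0, 0.5, size=N)
    accept = np.log(rng.uniform(size=N)) < log_pi(prop, betas[k]) - log_pi(x, betas[k])
    x = np.where(accept, prop, x)

w = np.exp(logw - logw.max()); w /= w.sum()
print("weighted mean under the final target (close to 0):", np.sum(w * x))
```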

Preliminaries
Temporal embedding of weakly structural Markov laws
Particle approximation of temporalised weakly structural Markov laws
The Christmas tree algorithm
Particle Gibbs sampling
Particle Gibbs with systematic refreshment
Application to decomposable graphical models
Numerical study
Czech autoworkers data
Continuous data with temporal dependence
Comparison to the Metropolis-Hastings algorithm
Graph theory
Proofs and lemmas