Abstract

Bayesian Networks (BNs) are probabilistic graphical models used to represent and encode uncertain expert knowledge. BNs stand out for handling uncertainty in decision making and statistical inference, and many algorithms have been described for inference in BNs; see Dechter (1996), Heckerman (1995), Jensen (1996), Lauritzen (1988), Pearl (1988), and Zhang (1996). The parallel algorithm described in this paper is based on the sequential variable elimination algorithm of Cozman (2000), which uses algebraic operations on potentials. These algebraic schemata for inference in BNs are not only relatively simple to understand and implement, but also allow us to use techniques, heuristics, and abstract combinatorial structures from the sparse matrix factorization literature; see George (1993) and Stern (1994, 2006, 2008). The main goal of this paper is to show how variations of the variable elimination algorithm can be combined with sparse matrix factorization methods to implement a fast and efficient parallel algorithm for inference in BNs. This goal is achieved by completely separating a first, symbolic phase from a second, numerical phase. In the symbolic phase, the proposed algorithm explores the graphical structure of the model without computing or even accessing probabilistic information. The numerical phase can then be fully vectorized and parallelized using static data structures defined in the symbolic phase. This is done by examining the decoupling, or separation, operators of sparse matrix factorization algorithms and BN inference procedures within a unified combinatorial framework. This unified framework is the key to an efficient implementation of the parallel algorithm.
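To make the two-phase idea concrete, the following Python sketch illustrates variable elimination split into a symbolic planning step and a numerical execution step. It is only a minimal illustration, not the authors' implementation or the algorithm of Cozman (2000); the function names plan_elimination and execute_plan, and the dictionary-based representation of potentials, are assumptions introduced here for the example.

from itertools import product

def plan_elimination(scopes, order):
    """Symbolic phase: from the factor scopes (tuples of variable names) and an
    elimination order, build a schedule of (bucket, variable, result scope,
    result id) steps without touching any numerical values."""
    live = {i: set(s) for i, s in enumerate(scopes)}
    schedule = []
    next_id = len(scopes)
    for var in order:
        bucket = [i for i, s in live.items() if var in s]
        if not bucket:
            continue
        new_scope = set().union(*(live[i] for i in bucket)) - {var}
        for i in bucket:
            del live[i]
        schedule.append((bucket, var, tuple(sorted(new_scope)), next_id))
        live[next_id] = new_scope
        next_id += 1
    return schedule

def execute_plan(factors, card, schedule):
    """Numerical phase: follow the precomputed schedule on the potential tables.
    `factors` maps an index to (scope, table), where the table is a dict keyed
    by value tuples in scope order; `card` gives each variable's cardinality.
    Every step uses only static structures built in the symbolic phase, which
    is what makes this phase easy to vectorize or run in parallel."""
    for bucket, var, new_scope, new_id in schedule:
        table = {}
        for assign in product(*(range(card[v]) for v in new_scope)):
            ctx = dict(zip(new_scope, assign))
            total = 0.0
            for val in range(card[var]):
                ctx[var] = val
                prod_val = 1.0
                for i in bucket:
                    scope_i, table_i = factors[i]
                    prod_val *= table_i[tuple(ctx[v] for v in scope_i)]
                total += prod_val
            table[assign] = total
        for i in bucket:
            del factors[i]
        factors[new_id] = (new_scope, table)
    return factors

# Example: a tiny chain A -> B with binary variables, eliminating A then B.
card = {"A": 2, "B": 2}
factors = {
    0: (("A",), {(0,): 0.6, (1,): 0.4}),                 # P(A)
    1: (("A", "B"), {(0, 0): 0.7, (0, 1): 0.3,
                     (1, 0): 0.2, (1, 1): 0.8}),         # P(B | A)
}
schedule = plan_elimination([("A",), ("A", "B")], order=["A", "B"])
result = execute_plan(factors, card, schedule)           # final potential sums to 1.0

Note how plan_elimination never reads a probability value: the schedule, and hence the shape of every intermediate potential, is fixed before any arithmetic is performed, mirroring the symbolic/numerical split described in the abstract.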
