Abstract

Multiply sectioned Bayesian networks (MSBNs) provide a coherent and flexible formalism for representing uncertain knowledge in large domains. Global consistency among the subnets of an MSBN is achieved by communication. When a subnet updates its belief with respect to an adjacent subnet, existing inference operations require repeated belief propagations within the receiving subnet, the number of which is proportional to the number of linkages between the two subnets, making communication less efficient. We redefine these operations so that two such propagations suffice, and we prove that the new operations improve efficiency without compromising coherence. An MSBN must be initialized before inference can take place. In existing methods, initialization requires dedicated operations not shared with inference. We show that the new inference operations presented here unify inference and initialization; hence they are not only more efficient but also simpler. The results are presented so as to highlight their connection with common inference methods for single Bayesian networks.
