Abstract

We present a systematic approach to mean-field theory (MFT) in a general probabilistic setting, without assuming a particular model. The mean-field equations derived here may serve as a local, and thus very simple, method for approximate inference in probabilistic models such as Boltzmann machines or Bayesian networks. Our approach is ‘model-independent’ in the sense that we do not assume a particular type of dependency; in a Bayesian network, for example, arbitrary tables may specify the conditional dependencies. In general, the mean-field equations have multiple solutions. We show that improved estimates can be obtained by forming a weighted mixture of these solutions, and we give simple approximate expressions for the mixture weights. The general formalism is then evaluated for the special case of Bayesian networks. The benefits of taking multiple solutions into account are demonstrated by using MFT for inference in a small and in a very large Bayesian network, and by comparing the results with the exact ones.
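The idea of combining multiple mean-field solutions can be illustrated with a small sketch. This is not the paper's algorithm, only a rough analogue for a Boltzmann machine with ±1 units: the mean-field equations m_i = tanh(h_i + Σ_j J_ij m_j) are iterated from several random starting points, the distinct fixed points are collected, and each is weighted by exp(−F), where F is its mean-field free energy (a common simple choice for such weights; the couplings `J`, fields `h`, and all parameter values below are invented for illustration).

```python
import numpy as np

rng = np.random.default_rng(0)

def mean_field(J, h, m0, iters=500, damp=0.5):
    """Damped fixed-point iteration of m_i = tanh(h_i + sum_j J_ij m_j)."""
    m = m0.copy()
    for _ in range(iters):
        m = (1 - damp) * m + damp * np.tanh(h + J @ m)
    return m

def free_energy(J, h, m):
    """Mean-field variational free energy for +/-1 units: energy - entropy."""
    eps = 1e-12
    p = (1 + m) / 2                      # probability of unit being +1
    entropy = -np.sum(p * np.log(p + eps) + (1 - p) * np.log(1 - p + eps))
    energy = -0.5 * m @ J @ m - h @ m
    return energy - entropy

# A small symmetric Boltzmann machine with arbitrary illustrative weights.
n = 6
J = rng.normal(scale=1.5, size=(n, n))
J = (J + J.T) / 2
np.fill_diagonal(J, 0.0)
h = rng.normal(scale=0.3, size=n)

# Find multiple mean-field solutions via random restarts, then deduplicate.
sols = [mean_field(J, h, rng.uniform(-1, 1, n)) for _ in range(20)]
unique = []
for m in sols:
    if not any(np.allclose(m, u, atol=1e-3) for u in unique):
        unique.append(m)

# Weight each solution by exp(-F) and normalize (subtracting F.min() for stability).
F = np.array([free_energy(J, h, m) for m in unique])
w = np.exp(-(F - F.min()))
w /= w.sum()

# Mixture estimate of the marginal means <s_i>.
m_mix = sum(wi * mi for wi, mi in zip(w, unique))
print(len(unique), "distinct solutions; mixture means:", np.round(m_mix, 3))
```

With strong couplings the iteration typically finds several fixed points, and the exp(−F) weighting lets the lower-free-energy solutions dominate the mixture estimate rather than picking a single solution arbitrarily.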

