Abstract

Often, a Bayesian network contains both discrete and continuous random variables. Although many Bayesian network inference packages allow the user to specify continuous and discrete variables in the same network, simpler and better inference results can be obtained by representing the variables as discrete. The standard application of Bayes' Theorem amounts to inference in a two-node Bayesian network; larger Bayesian networks address the problem of representing the joint probability distribution of a large number of variables. The chapter presents examples illustrating how the conditional independencies entailed by the Markov condition can be exploited to accomplish inference in a Bayesian network. A directed acyclic graph (DAG) entails a conditional independency if every probability distribution that satisfies the Markov condition with the DAG must have that conditional independency. Conditional probability distributions can either be obtained from the subjective judgements of an expert in the area or learned from data. The chapter discusses two techniques for simplifying the process of ascertaining these distributions: the first concerns the case where a node has multiple parents, while the second concerns nodes that represent continuous random variables. The noisy OR-gate model addresses the case where the relationships among variables ordinarily represent causal influences and each variable has only two values.
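As a minimal sketch of the noisy OR-gate model the abstract mentions: under the usual formulation, each binary cause, when present, independently produces the binary effect with its own "causal strength," so the effect is absent only if every present cause fails to trigger it. The function name and all parameter values below are invented for illustration and are not taken from the chapter.

```python
def noisy_or(strengths, states):
    """P(effect present) under a noisy OR-gate.

    strengths: causal strength p_i of each cause (probability that
               cause i alone produces the effect when present).
    states:    1 if cause i is present, 0 otherwise.
    Computes  P(Y=1 | x) = 1 - prod over present causes of (1 - p_i).
    """
    prob_all_fail = 1.0
    for p, x in zip(strengths, states):
        if x:  # only a present cause can trigger the effect
            prob_all_fail *= 1.0 - p
    return 1.0 - prob_all_fail

# Three hypothetical causes with strengths 0.8, 0.5, 0.4; the first
# two are present: P(Y=1) = 1 - (0.2)(0.5) = 0.9.
print(noisy_or([0.8, 0.5, 0.4], [1, 1, 0]))  # → 0.9
```

The appeal of the model, as the chapter notes, is that a node with many parents needs only one strength parameter per parent rather than a full conditional probability table that is exponential in the number of parents.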
