Abstract

Hybrid Bayesian Networks (HBNs), which contain both discrete and continuous variables, arise naturally in many application areas (e.g., image understanding, data fusion, medical diagnosis, fraud detection). This paper concerns inference in an important subclass of HBNs, the conditional Gaussian (CG) networks, in which all continuous random variables have Gaussian distributions and all children of continuous random variables must be continuous. Inference in CG networks can be NP-hard even for special-case structures, such as poly-trees, where inference in discrete Bayesian networks can be performed in polynomial time. Therefore, approximate inference is required. In approximate inference, it is often necessary to trade off accuracy against solution time. This paper presents an extension to the Hybrid Message Passing inference algorithm for general CG networks and an algorithm for optimizing its accuracy given a bound on computation time. The extended algorithm uses Gaussian mixture reduction to prevent an exponential increase in the number of Gaussian mixture components. The trade-off algorithm performs pre-processing to find optimal run-time settings for the extended algorithm. Experimental results for four CG networks compare performance of the extended algorithm with existing algorithms and show the optimal settings for these CG networks.

Highlights

  • A Bayesian Network (BN) [1] is a probabilistic graphical model that represents a joint distribution on a set of random variables in a compact form that exploits conditional independence relationships among the random variables

  • This paper presents an extension to the Hybrid Message Passing inference algorithm for general conditional Gaussian (CG) networks and an algorithm for optimizing its accuracy given a bound on computation time

  • Results for the conditional nonlinear Gaussian (CNG) networks showed similar patterns and are omitted for brevity. These experiments showed that Hybrid Message Passing with Gaussian Mixture Reduction (HMP-GMR) is scalable to large BNs for both linear and nonlinear CG networks


Summary

Introduction

A Bayesian Network (BN) [1] is a probabilistic graphical model that represents a joint distribution on a set of random variables in a compact form that exploits conditional independence relationships among the random variables. Because inference is exponential in the number of discrete nodes in a cluster, clustering algorithms are often intractable for hybrid networks even when a tractable clustering exists for a discrete network of the same structure [4]. This paper presents a complete solution to the hybrid inference problem by providing two algorithms: Hybrid Message Passing with Gaussian Mixture Reduction (HMP-GMR) and its optimal-settings variant (HMP-GMR-OS). The HMP-GMR algorithm prevents exponential growth in the number of Gaussian mixture components during message passing in CG Bayesian networks. Accuracy and speed can depend on the Bayesian network and the specific pattern of evidence; these characteristics can be used as guidance for choosing an inference method for a given problem. The HMP-GMR-OS algorithm is intended for cases in which a given HBN will be used repeatedly in a time-limited situation and a pre-processing step is desired to balance accuracy against speed of inference.
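The paper does not reproduce its GMR procedure here, but the core idea, capping the number of mixture components by repeatedly merging the cheapest pair, can be sketched for one-dimensional mixtures. The sketch below uses moment-preserving merges scored by the Runnalls KL upper bound; all function names are hypothetical and this is an illustration of the general technique, not the authors' implementation.

```python
import itertools
import math

def merge(w1, m1, v1, w2, m2, v2):
    """Moment-preserving merge of two weighted 1-D Gaussian components.

    The merged component keeps the pair's total weight, mean, and variance.
    """
    w = w1 + w2
    m = (w1 * m1 + w2 * m2) / w
    v = (w1 * (v1 + (m1 - m) ** 2) + w2 * (v2 + (m2 - m) ** 2)) / w
    return w, m, v

def merge_cost(w1, m1, v1, w2, m2, v2):
    """Upper bound on the KL divergence incurred by merging this pair."""
    w, m, v = merge(w1, m1, v1, w2, m2, v2)
    return 0.5 * (w * math.log(v) - w1 * math.log(v1) - w2 * math.log(v2))

def reduce_mixture(components, max_components):
    """Greedily merge the cheapest pair until the mixture is small enough.

    components: list of (weight, mean, variance) tuples.
    """
    comps = list(components)
    while len(comps) > max_components:
        i, j = min(
            itertools.combinations(range(len(comps)), 2),
            key=lambda ij: merge_cost(*comps[ij[0]], *comps[ij[1]]),
        )
        merged = merge(*comps[i], *comps[j])
        comps = [c for k, c in enumerate(comps) if k not in (i, j)]
        comps.append(merged)
    return comps
```

Because each merge preserves the mixture's total weight and first two moments, the reduced mixture stays a faithful coarse summary of the original; the `max_components` bound is exactly the kind of run-time setting the paper's optimization step would tune.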

Structure of Hybrid Bayesian Network
Message Passing Inference for Discrete BN
Message Passing Inference for Continuous BNs
Message Passing Inference for Hybrid BNs
Gaussian Mixture Reduction
Extended Hybrid Message Passing Algorithm
Optimizing the Settings of HMP-GMR
Experiment
Scalability of HMP-GMR
Accuracy and Efficiency of HMP-GMR
Optimal Settings for HMP-GMR
Conclusions
