Abstract

Bayesian networks, in both their static and dynamic forms, have been the subject of extensive theoretical analysis and practical approximate-inference work in the artificial intelligence, machine learning, and pattern recognition communities. After reviewing the well-established theory of discrete and continuous Bayesian networks, we introduce a near-instant reasoning scheme for hybrid Bayesian networks. In addition to illustrating the similarities between dynamic Bayesian networks (DBNs) and the Kalman filter, we present a computationally efficient approach to the inference problem in hybrid dynamic Bayesian networks (HDBNs). The proposed method separates the dynamic nodes from the static nodes and then forms hypercubic partitions via a decision tree (DT) algorithm. Experiments show, with high statistical confidence, that the proposed HDBN algorithm achieves a favorable tradeoff between computational complexity and accuracy compared with the junction tree algorithm and Gaussian mixture models on classification tasks.
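
For readers unfamiliar with the DBN–Kalman filter connection referenced above, the standard way to state it (in generic notation, not notation taken from this paper) is that the Kalman filter performs exact inference in a linear-Gaussian DBN whose transition and observation models are

$$\mathbf{x}_t = A\,\mathbf{x}_{t-1} + \mathbf{w}_t, \qquad \mathbf{w}_t \sim \mathcal{N}(\mathbf{0}, Q),$$
$$\mathbf{z}_t = H\,\mathbf{x}_t + \mathbf{v}_t, \qquad \mathbf{v}_t \sim \mathcal{N}(\mathbf{0}, R),$$

so each time slice contains a hidden state node $\mathbf{x}_t$ and an observation node $\mathbf{z}_t$ with linear-Gaussian conditional distributions; here $A$, $H$, $Q$, and $R$ are generic placeholders for the transition matrix, observation matrix, and noise covariances.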
