Abstract

Learning large Bayesian networks (BNs) from data is a challenging problem due to the vastness of the structure space. An effective way to make this problem affordable is the use of super-structures (SS), undirected graphs that contain the BN skeleton. However, the literature has lacked specialized methods for estimating SS. We present two algorithms for this purpose within the hybrid approach to BN structure learning. The first, called Opt01SS, learns the SS using only zero- and first-order conditional independence (CI) tests, in a way that copes with approximate-deterministic relationships and inconsistent CIs, which are commonly found in small samples. The second, called OptHPC, is a computationally optimized version of the recent HPC algorithm (De Morais and Aussem 2010, [17]), which has shown attractive accuracy for SS recovery. Results on several benchmark networks show that the proposed algorithms achieve a balance between sensitivity and specificity that is clearly more favorable for SS estimation than that of several representative state-of-the-art methods. Their computational cost is also reasonable, with Opt01SS being among the most competitive of the algorithms analyzed.
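
The abstract does not specify the internals of Opt01SS; the following is a minimal illustrative sketch of the general idea of building a super-structure with only zero- and first-order CI tests. It uses a Fisher's-z partial-correlation test on continuous data as a stand-in CI test; the function names, the choice of test, and the threshold alpha are assumptions, and the sketch omits the paper's handling of approximate-deterministic relationships and inconsistent CI results.

```python
import numpy as np
from scipy import stats

def ci_test(data, i, j, k=None, alpha=0.05):
    """Zero- or first-order CI test via (partial) correlation and Fisher's z.

    Returns True if independence of columns i and j (given k, if provided)
    is accepted at level alpha. This is an assumed stand-in test, not the
    test used in the paper.
    """
    n = data.shape[0]
    if k is None:
        r = np.corrcoef(data[:, i], data[:, j])[0, 1]
        cond_size = 0
    else:
        # First-order partial correlation r_{ij.k}
        c = np.corrcoef(data[:, [i, j, k]], rowvar=False)
        r_ij, r_ik, r_jk = c[0, 1], c[0, 2], c[1, 2]
        r = (r_ij - r_ik * r_jk) / np.sqrt((1 - r_ik**2) * (1 - r_jk**2))
        cond_size = 1
    r = np.clip(r, -0.999999, 0.999999)
    z = 0.5 * np.log((1 + r) / (1 - r))
    stat = np.sqrt(n - cond_size - 3) * abs(z)
    p_value = 2 * (1 - stats.norm.cdf(stat))
    return p_value > alpha

def estimate_super_structure(data, alpha=0.05):
    """Estimate an undirected super-structure using only 0th/1st-order CI tests."""
    d = data.shape[1]
    # Start from the complete undirected graph over all variable pairs.
    edges = {(i, j) for i in range(d) for j in range(i + 1, d)}
    # Order-0 pruning: drop edges between marginally independent pairs.
    edges = {(i, j) for (i, j) in edges if not ci_test(data, i, j, None, alpha)}
    # Order-1 pruning: drop (i, j) if some single third variable k separates them.
    pruned = set()
    for (i, j) in edges:
        for k in range(d):
            if k in (i, j):
                continue
            if ci_test(data, i, j, k, alpha):
                pruned.add((i, j))
                break
    return edges - pruned
```

Because the conditioning sets never exceed size one, the number of tests grows only cubically with the number of variables, which is what makes this kind of pre-processing attractive before a more expensive BN structure search restricted to the estimated SS.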
