Abstract

A Bayesian network is a graphical model that is widely used for probabilistic reasoning. However, learning the structure of a Bayesian network is a complex task. In this paper, we propose a hybrid structure learning algorithm with two phases: a constraint-based phase that reduces the search space, and a score-and-search phase that employs a case-injected genetic algorithm to determine the optimal structure within the reduced space. We adopt this case-injected, genetic-algorithm-based hybrid approach in order to improve learning accuracy across similar problems. A case-injected genetic algorithm augments a Genetic Algorithm (GA) with a case-based memory, and thereby finds near-optimal solutions in fewer generations than a GA alone. Our method stores relevant or partial solutions in a case base while solving problems and reuses those stored solutions on new, similar problems. We assess the viability of our approach on small to very large networks, conducting a series of experiments on datasets generated from four benchmark Bayesian networks. We compare our method against a GA-based hybrid approach and a state-of-the-art algorithm, Max-Min Hill Climbing (MMHC). The results indicate that our approach improves over both the GA-based hybrid and MMHC in learning Bayesian network structures.
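Since the abstract only summarizes the two-phase design, the following minimal Python sketch illustrates the general idea under simplifying assumptions; it is not the algorithm evaluated in the paper. Phase 1 is represented only by a precomputed set of candidate parents per node (as a constraint-based step would produce); Phase 2 runs a small genetic algorithm over structures drawn from that reduced space and periodically injects high-scoring structures stored in a case base from earlier, similar runs. The structure encoding, operators, parameters (pop_size, inject_every), and the caller-supplied scoring function (e.g., a BIC score on the training data) are all illustrative assumptions.

```python
import random

def acyclic(parents):
    """Return True if the {child: set_of_parents} encoding has no directed cycle."""
    state = {}                          # absent = unvisited, 1 = in progress, 2 = done
    def visit(v):
        if state.get(v) == 1:           # back edge -> cycle
            return False
        if state.get(v) == 2:
            return True
        state[v] = 1
        ok = all(visit(p) for p in parents[v])
        state[v] = 2
        return ok
    return all(visit(v) for v in parents)

def random_structure(candidates, max_parents=3):
    """Sample a DAG whose edges are restricted to the phase-1 candidate parent sets."""
    parents = {v: set() for v in candidates}
    for v, cand in candidates.items():
        for p in random.sample(sorted(cand), min(len(cand), max_parents)):
            parents[v].add(p)
            if not acyclic(parents):
                parents[v].discard(p)   # keep the structure acyclic
    return parents

def mutate(structure, candidates):
    """Add or remove one edge allowed by the candidate sets, preserving acyclicity."""
    child = {v: set(ps) for v, ps in structure.items()}
    v = random.choice(sorted(candidates))
    if candidates[v]:
        p = random.choice(sorted(candidates[v]))
        if p in child[v]:
            child[v].discard(p)
        else:
            child[v].add(p)
            if not acyclic(child):
                child[v].discard(p)
    return child

def case_injected_ga(candidates, score, case_base,
                     pop_size=20, generations=50, inject_every=10):
    """Score-and-search phase: a GA periodically refreshed from a case base."""
    population = [random_structure(candidates) for _ in range(pop_size)]
    for gen in range(generations):
        population.sort(key=score, reverse=True)        # higher score = better
        if gen % inject_every == 0 and case_base:
            # Case injection: replace the worst individuals with solutions
            # stored while solving earlier, similar problems.
            k = min(len(case_base), pop_size // 4)
            population[-k:] = random.sample(case_base, k)
        survivors = population[:pop_size // 2]
        offspring = [mutate(random.choice(survivors), candidates)
                     for _ in range(pop_size - len(survivors))]
        population = survivors + offspring
    best = max(population, key=score)
    case_base.append(best)              # save the result for reuse on future problems
    return best

# Hypothetical usage: candidates = {0: {1}, 1: {0, 2}, 2: {1}} from a
# constraint-based phase, score = a decomposable score such as BIC on the data,
# and case_base = [] on the first problem, then carried over to similar ones.
```

A plain GA would omit the injection step; the abstract's claim is that reusing the case base across similar problems allows the search to converge in fewer generations.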
