Abstract

Bayesian network classifiers (BNCs) are powerful tools for knowledge representation and inference under uncertainty. In contrast to eager learning, lazy learning seeks to improve the classification accuracy of BNCs by building a decision model tailored to each testing instance, but it has received less attention because of its high computational cost at classification time. This study introduces the conditionally independently and identically distributed (c.i.i.d.) assumption to BNCs, which posits that all instances of the same class are conditionally independent of each other and are drawn from the same probability distribution. Based on this premise, we propose a novel lazy BNC, the semi-lazy Bayesian network classifier (SLB), which transforms each unlabeled testing instance into a series of complete instances, each carrying a different supposed class label, and then builds a class-specific local BNC for each of them. Our experimental comparison on 25 UCI datasets shows that SLB incurs modest training-time overhead and low classification-time overhead. The Friedman and Nemenyi tests show that SLB has significant zero–one loss and bias advantages over several state-of-the-art BNCs, including the selective k-dependence Bayesian classifier, k-nearest neighbor, lazy Bayesian rule, and averaged n-dependence estimators with lazy subsumption resolution.
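The semi-lazy loop described in the abstract can be made concrete with a minimal sketch. The actual SLB builds class-specific local Bayesian network classifiers; since the paper's structure-learning details are not given here, this sketch substitutes a Laplace-smoothed naive Bayes as the stand-in local model, purely for illustration. All names (`fit_local_nb`, `slb_predict`) are hypothetical, not from the paper.

```python
import numpy as np

def fit_local_nb(X, y, n_values, n_classes, alpha=1.0):
    """Stand-in local model: per-class priors P(c) and attribute
    distributions P(x_i | c) with Laplace smoothing (alpha)."""
    priors = np.zeros(n_classes)
    cond = [np.zeros((n_classes, v)) for v in n_values]
    for c in range(n_classes):
        Xc = X[y == c]
        priors[c] = (len(Xc) + alpha) / (len(X) + alpha * n_classes)
        for i, v in enumerate(n_values):
            counts = np.bincount(Xc[:, i], minlength=v) + alpha
            cond[i][c] = counts / counts.sum()
    return priors, cond

def slb_predict(x, X_train, y_train, n_values, n_classes):
    """For each supposed class label c, complete the testing instance
    as (x, c), fit a local model on the training data augmented with
    that completed instance, score its posterior, and predict argmax."""
    scores = np.zeros(n_classes)
    for c in range(n_classes):
        # c.i.i.d. assumption: instances of class c share one
        # distribution, so the completed instance (x, c) is pooled
        # with the class-c training data before fitting.
        X_aug = np.vstack([X_train, x])
        y_aug = np.append(y_train, c)
        priors, cond = fit_local_nb(X_aug, y_aug, n_values, n_classes)
        scores[c] = np.log(priors[c]) + sum(
            np.log(cond[i][c, x[i]]) for i in range(len(x)))
    return int(np.argmax(scores))

# Toy usage: 4 discrete attributes with 3 values each, 2 classes.
rng = np.random.default_rng(0)
X_train = rng.integers(0, 3, size=(100, 4))
y_train = rng.integers(0, 2, size=100)
x_test = rng.integers(0, 3, size=4)
print(slb_predict(x_test, X_train, y_train, n_values=[3] * 4, n_classes=2))
```

The per-class refit inside `slb_predict` is what makes the scheme lazy: a fresh local model is built at classification time for each supposed label, rather than one global model at training time.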
