Abstract

Naive Bayes (NB) is a probability-based classification model built on the attribute independence assumption. However, in many real-world data mining applications, this assumption is often violated. Responding to this fact, researchers have made substantial efforts to improve the classification accuracy of NB by weakening its attribute independence assumption. As a recent example, averaged one-dependence estimators (AODE) weakens the assumption by averaging all models from a restricted class of one-dependence classifiers. However, all one-dependence classifiers in AODE have the same weight and are treated equally. According to our observation, different one-dependence classifiers should have different weights. Therefore, in this article, we propose an improved model called weighted average of one-dependence estimators (WAODE), which assigns different weights to these one-dependence classifiers. In our WAODE, four different weighting approaches are designed, and thus four different versions are created. For simplicity, we denote them by WAODE-MI, WAODE-ACC, WAODE-CLL and WAODE-AUC, respectively. The experimental results on a large number of UCI datasets published on the main website of the Weka platform show that our WAODE significantly outperforms AODE. †This article is an extended version of a PRICAI 2006 conference paper (Jiang and Zhang 2006).
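The weighted averaging the abstract describes can be illustrated with a minimal sketch. The code below is a hypothetical implementation, not the authors' code: it estimates each one-dependence model P(c, x_i) Π_j P(x_j | c, x_i) from counts with Laplace smoothing and weights the super-parent attribute x_i by the mutual information I(X_i; Y), in the spirit of the WAODE-MI variant; the class name, toy dataset, and smoothing constant are assumptions for illustration.

```python
import math
from collections import defaultdict

class WAODE:
    """Minimal sketch of a weighted AODE classifier for categorical data.

    Each one-dependence estimator (super-parent attribute x_i) is weighted
    by the mutual information I(X_i; Y), as in the WAODE-MI variant.
    Hypothetical illustration, not the paper's reference implementation.
    """

    def __init__(self, smoothing=1.0):
        self.s = smoothing  # Laplace smoothing constant (assumed)

    def fit(self, X, y):
        self.n = len(X)
        self.d = len(X[0])
        self.classes = sorted(set(y))
        self.values = [sorted({row[i] for row in X}) for i in range(self.d)]
        # Joint counts N(c, x_i=v) and N(c, x_i=v, x_j=u).
        self.c1 = defaultdict(int)
        self.c2 = defaultdict(int)
        for row, c in zip(X, y):
            for i, v in enumerate(row):
                self.c1[(c, i, v)] += 1
                for j, u in enumerate(row):
                    self.c2[(c, i, v, j, u)] += 1
        # Mutual information of each attribute with the class = its weight.
        self.w = [self._mutual_info(i) for i in range(self.d)]
        return self

    def _mutual_info(self, i):
        mi = 0.0
        for c in self.classes:
            pc = sum(self.c1[(c, i, v)] for v in self.values[i]) / self.n
            for v in self.values[i]:
                pcv = self.c1[(c, i, v)] / self.n
                pv = sum(self.c1[(cc, i, v)] for cc in self.classes) / self.n
                if pcv > 0:
                    mi += pcv * math.log(pcv / (pc * pv))
        return mi

    def predict(self, row):
        best, best_score = None, -1.0
        for c in self.classes:
            score = 0.0
            for i, v in enumerate(row):
                # P(c, x_i=v) with Laplace smoothing.
                p = (self.c1[(c, i, v)] + self.s) / (
                    self.n + self.s * len(self.classes) * len(self.values[i]))
                for j, u in enumerate(row):
                    if j == i:
                        continue
                    # P(x_j=u | c, x_i=v) with Laplace smoothing.
                    p *= (self.c2[(c, i, v, j, u)] + self.s) / (
                        self.c1[(c, i, v)] + self.s * len(self.values[j]))
                # Weighted sum over one-dependence estimators.
                score += self.w[i] * p
            if score > best_score:
                best, best_score = c, score
        return best
```

On a toy dataset where the class copies the first attribute, the first attribute receives a positive MI weight while an independent attribute receives weight zero, so the weighted sum is driven by the informative super-parent. AODE corresponds to setting all weights equal.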

