Abstract

One of the Bayesian network classifiers widely used in classification is the K-dependence Bayesian (KDB) classifier. However, most KDB classifiers build a single network over the class variable without considering the dependencies between features within each class. Moreover, many KDB classifiers require a discretization step to handle continuous features. This paper proposes a fast Multi-Network K-Dependence Bayesian (MNKDB) classifier for continuous features. To this end, we propose a non-parametric approach that efficiently identifies dependencies between continuous features in each class at low computational cost and without discretizing the features. The results indicate that the MNKDB classifier is more accurate than state-of-the-art KDB classifiers, especially on datasets with more than three classes. The MNKDB classifier not only decreases classification time but also handles continuous variables without discretizing them. For K=2, the MNKDB classifier is 36.5, 31.8, and 14.2 times faster and 4.13%, 5.15%, and 5.48% more accurate than the state-of-the-art FKDB (Flexible KDB), KMM-KDB (Kernel Mixture Model based on KDB), and SKDB (Scalable KDB) classifiers, respectively.
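The multi-network idea can be illustrated with a small sketch: for each class, take the class-conditional subsample and give every feature up to K parents, namely the other features most strongly dependent on it. This is not the paper's algorithm; it is a hypothetical illustration in which dependence between continuous features is scored by mutual information under a bivariate-Gaussian assumption, I(X;Y) = -0.5 ln(1 - rho^2), which is cheap and needs no discretization. All function names here are invented for the example.

```python
import numpy as np

def gaussian_mi(x, y):
    """Mutual information under a bivariate-Gaussian assumption:
    I(X;Y) = -0.5 * ln(1 - rho^2). Discretization-free and O(n)."""
    rho = np.corrcoef(x, y)[0, 1]
    rho = float(np.clip(rho, -0.999999, 0.999999))  # guard against rho = +/-1
    return -0.5 * np.log(1.0 - rho ** 2)

def per_class_parents(X, y, k=2):
    """For each class, select up to k parents per feature: the other
    features with the highest Gaussian-approximated MI to it within
    that class's subsample. Returns {class: {feature: [parents]}}."""
    structure = {}
    for c in np.unique(y):
        Xc = X[y == c]           # class-conditional subsample
        d = Xc.shape[1]
        parents = {}
        for i in range(d):
            scores = [(gaussian_mi(Xc[:, i], Xc[:, j]), j)
                      for j in range(d) if j != i]
            scores.sort(reverse=True)        # strongest dependencies first
            parents[i] = [j for _, j in scores[:k]]
        structure[c] = parents
    return structure

# Toy demo: f0 and f1 are strongly coupled, f2 is independent noise.
rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=200)
f0 = rng.normal(size=200) + y            # mean shifts with the class
f1 = f0 + 0.1 * rng.normal(size=200)     # strongly tied to f0
f2 = rng.normal(size=200)                # unrelated feature
X = np.column_stack([f0, f1, f2])
print(per_class_parents(X, y, k=1))      # f0 and f1 pick each other in both classes
```

A full classifier would additionally fit class-conditional densities along the learned edges and classify by maximum posterior; the sketch only shows why per-class structures can differ, which is the premise the abstract builds on.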
