Abstract

Naive Bayes is one of the most widely used classification techniques in data mining and machine learning. Although naive Bayes learners are efficient, they suffer from the unrealistic assumption of conditional independence between attributes. Many algorithms have been proposed to improve the effectiveness of the naive Bayes classifier by inserting discriminative approaches into its generative structure. Generative and discriminative viewpoints are combined in many of these algorithms, e.g., through attribute weighting, instance weighting, or ensemble methods. In this paper, a new ensemble of Gaussian naive Bayes classifiers is proposed, based on a mixture of Gaussian distributions formed on less conditionally dependent features extracted by local PCA. A semi-AdaBoost approach is used for dynamic adaptation of the distributions with respect to misclassified instances. The proposed method has been evaluated and compared with related work on 12 UCI machine learning datasets, and the results show a significant improvement in performance.
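
To make the general idea concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of an AdaBoost-style ensemble of Gaussian naive Bayes learners fitted on PCA-rotated features. The dataset, the number of boosting rounds, and the number of retained components are illustrative, and a single global PCA stands in for the paper's local PCA; the instance-reweighting loop only loosely mirrors the semi-AdaBoost adaptation described in the abstract.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.decomposition import PCA
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Rotate features with PCA to reduce conditional dependence between
# attributes before applying Gaussian naive Bayes (a simplification of
# the paper's local PCA).
pca = PCA(n_components=10).fit(X_tr)
Z_tr, Z_te = pca.transform(X_tr), pca.transform(X_te)

n_rounds = 10
w = np.full(len(Z_tr), 1.0 / len(Z_tr))   # instance weights
models, alphas = [], []

for _ in range(n_rounds):
    # Fit a weighted Gaussian naive Bayes learner on the rotated features.
    nb = GaussianNB().fit(Z_tr, y_tr, sample_weight=w)
    pred = nb.predict(Z_tr)

    # Standard binary AdaBoost update: learner weight from weighted error,
    # then boost the weights of misclassified instances.
    err = np.clip(np.sum(w * (pred != y_tr)), 1e-10, 1 - 1e-10)
    alpha = 0.5 * np.log((1 - err) / err)
    w *= np.exp(alpha * np.where(pred != y_tr, 1.0, -1.0))
    w /= w.sum()

    models.append(nb)
    alphas.append(alpha)

# Weighted vote over the ensemble (binary labels mapped to {-1, +1}).
scores = sum(a * (2 * nb.predict(Z_te) - 1) for a, nb in zip(alphas, models))
y_hat = (scores > 0).astype(int)
print("test accuracy:", np.mean(y_hat == y_te))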
