Abstract

Naive Bayes (NB) is a popular classification technique in data mining and machine learning. Many methods improve the performance of NB by mitigating its primary weakness, the assumption that attributes are conditionally independent given the class, using techniques such as backwards sequential elimination and lazy elimination. Weighting techniques, including attribute weighting and instance weighting, have also been proposed to improve the accuracy of NB. In this paper, we propose a dual weighted model for NB classification, namely DWNB. In DWNB, we first employ an instance-similarity-based method to weight each training instance. We then build an attribute weighted model on the weighted training data, where the probability estimates incorporate the embedded instance weights. This dual instance and attribute weighting allows DWNB to relax the conditional independence assumption for accurate classification. Experiments and comparisons on 36 benchmark data sets demonstrate that DWNB outperforms existing weighted NB algorithms.
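The abstract only outlines DWNB at a high level, so the following Python sketch is a hedged illustration rather than the paper's exact algorithm: the same-class agreement similarity in instance_weights, the exponent form of attribute weighting in predict_dwnb, and the Laplace smoothing are all assumptions introduced here for concreteness.

```python
import numpy as np

def instance_weights(X, y):
    """Weight each training instance by its average attribute-value
    agreement with same-class instances (an assumed similarity measure)."""
    w = np.ones(X.shape[0])
    for i in range(X.shape[0]):
        same = X[y == y[i]]              # rows sharing instance i's class
        w[i] = (same == X[i]).mean()     # fraction of matching attribute values
    return w

def fit_dwnb(X, y):
    """Estimate class priors and per-attribute conditional probabilities
    from instance-weighted, Laplace-smoothed frequency counts."""
    classes = np.unique(y)
    w = instance_weights(X, y)
    priors, cond = {}, {}
    for c in classes:
        mask = y == c
        wc = w[mask]
        priors[c] = (wc.sum() + 1.0) / (w.sum() + len(classes))
        cond[c] = []
        for j in range(X.shape[1]):
            vals = np.unique(X[:, j])
            col = X[mask, j]
            total = wc.sum() + len(vals)  # weighted class count + smoothing
            cond[c].append({v: (wc[col == v].sum() + 1.0) / total
                            for v in vals})
    return classes, priors, cond

def predict_dwnb(x, classes, priors, cond, attr_w):
    """Score each class in log space; attribute weights enter as exponents
    on the conditional probabilities (one common weighting form)."""
    best, best_score = None, -np.inf
    for c in classes:
        score = np.log(priors[c])
        for j, v in enumerate(x):
            score += attr_w[j] * np.log(cond[c][j].get(v, 1e-9))
        if score > best_score:
            best, best_score = c, score
    return best

# Toy usage with uniform attribute weights (illustrative only).
X = np.array([["sunny", "hot"], ["rainy", "mild"],
              ["sunny", "mild"], ["rainy", "hot"]])
y = np.array(["no", "yes", "yes", "no"])
model = fit_dwnb(X, y)
print(predict_dwnb(np.array(["sunny", "mild"]), *model, attr_w=np.ones(2)))
```

Under these assumptions, both weight types enter the familiar NB decision rule: instance weights replace raw counts in the frequency estimates, while attribute weights scale each attribute's log-likelihood contribution.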
