Abstract

Naive Bayes (NB) is a popular, fast, and effective supervised learning method for labeled datasets, and it performs well even when the data contain noise. However, its conditional independence assumption restricts its ability to model real-world data. Researchers have proposed many methods to relax this assumption, including attribute weighting and kernel density estimation. In this paper, we propose a novel approach, Naive Bayes Based on Attribute Weighting in Kernel Density Estimation (NBAWKDE), which improves NB classification by combining kernel density estimation with attribute weighting based on conditional mutual information. Embedding the weights in the kernel gives them a relatively interpretable meaning, and the framework is flexible: different metrics and methods can be chosen to measure the weights within our attribute-weighting-in-kernel-density-estimation framework.
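The abstract does not give the method's details, but the general idea it names (NB with per-attribute kernel density estimates, where each attribute's contribution is weighted by an information-theoretic score) can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: it uses plain mutual information between each discretized attribute and the class (rather than the paper's conditional mutual information), applies the weight as an exponent on each attribute's KDE likelihood, and picks bandwidths with Silverman's rule. All function names are hypothetical.

```python
import numpy as np

def mi_weights(X, y, bins=5):
    """Weight each attribute by its (quantile-discretized) mutual information
    with the class label, normalized to mean 1.  A simple stand-in for the
    conditional-mutual-information weighting named in the abstract."""
    n, d = X.shape
    classes, y_idx = np.unique(y, return_inverse=True)
    weights = np.empty(d)
    for j in range(d):
        # quantile-bin the attribute so MI is estimable from count tables
        edges = np.quantile(X[:, j], np.linspace(0, 1, bins + 1)[1:-1])
        xb = np.digitize(X[:, j], edges)
        joint = np.zeros((bins, len(classes)))
        for xi, yi in zip(xb, y_idx):
            joint[xi, yi] += 1
        joint /= n
        px = joint.sum(axis=1, keepdims=True)
        py = joint.sum(axis=0, keepdims=True)
        ratio = np.where(joint > 0, joint / (px * py), 1.0)
        weights[j] = np.sum(joint * np.log(ratio))   # I(X_j; Y)
    return weights / weights.mean()

def kde_log_density(train_col, x, h):
    """Log of a Gaussian-kernel density estimate at scalar x."""
    z = (x - train_col) / h
    k = np.exp(-0.5 * z * z) / (np.sqrt(2.0 * np.pi) * h)
    return np.log(k.mean() + 1e-300)          # guard against log(0)

def nb_kde_predict(X_train, y_train, X_test, weights):
    """Weighted naive Bayes: log P(c) + sum_j w_j * log f_KDE(x_j | c)."""
    classes = np.unique(y_train)
    preds = []
    for x in X_test:
        scores = []
        for c in classes:
            Xc = X_train[y_train == c]
            logp = np.log(len(Xc) / len(X_train))      # class prior
            for j, w in enumerate(weights):
                # Silverman's rule-of-thumb bandwidth per class and attribute
                h = 1.06 * Xc[:, j].std() * len(Xc) ** (-0.2)
                logp += w * kde_log_density(Xc[:, j], x[j], h)
            scores.append(logp)
        preds.append(classes[int(np.argmax(scores))])
    return np.array(preds)
```

On synthetic data with one informative attribute and one pure-noise attribute, `mi_weights` assigns the informative attribute the larger weight, so its KDE term dominates the class score; this is the interpretability the abstract alludes to, and `mi_weights` could be swapped for any other relevance metric.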
