Abstract

Within the field of supervised classification, the naïve Bayes (NB) classifier is a simple and fast method that obtains good results, comparable even with those of much more complex models. It has been proved that the NB model depends strongly on the estimation of conditional probabilities. The literature has shown that the classical and Laplace estimations of probabilities have some drawbacks, and an NB model called m-probability-estimation was proposed that takes the a priori probabilities into account when estimating the conditional probabilities. With very scarce experimentation, this m-probability-estimation approach was shown to provide better results than NB with classical and Laplace estimations of probabilities. In this research, a new naïve Bayes variation is proposed; it builds on the m-probability-estimation version and uses imprecise probabilities to calculate the a priori probabilities. An exhaustive experimental study is carried out with a large number of data sets and different levels of class noise. From this experimentation, we conclude that the proposed NB model and the m-probability-estimation approach provide better results than NB with classical and Laplace estimation of probabilities. It is also shown that the proposed NB improves on the m-probability-estimation model, especially when there is some class noise.
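To make the contrast between the estimation schemes concrete, the following is a minimal sketch of the three conditional-probability estimators the abstract compares: the classical (relative-frequency) estimate, the Laplace estimate, and the m-probability-estimate. The function names, counts, and parameter values are illustrative assumptions, not taken from the paper itself.

```python
def classical_estimate(n_xc, n_c):
    # Relative frequency: P(x|c) = n_xc / n_c.
    # Drawback: yields exactly zero for any value unseen with class c.
    return n_xc / n_c

def laplace_estimate(n_xc, n_c, k):
    # Laplace (add-one) smoothing over k possible attribute values:
    # P(x|c) = (n_xc + 1) / (n_c + k).
    return (n_xc + 1) / (n_c + k)

def m_estimate(n_xc, n_c, m, prior):
    # m-probability-estimation: blend the observed frequency with an
    # a priori probability `prior`, weighted by the parameter m:
    # P(x|c) = (n_xc + m * prior) / (n_c + m).
    return (n_xc + m * prior) / (n_c + m)

if __name__ == "__main__":
    # An attribute value never observed with this class (n_xc = 0):
    # the classical estimate collapses to zero, while both smoothed
    # estimates keep a small nonzero probability.
    n_xc, n_c, k = 0, 20, 4
    print(classical_estimate(n_xc, n_c))             # 0.0
    print(laplace_estimate(n_xc, n_c, k))            # 1/24
    print(m_estimate(n_xc, n_c, m=2, prior=0.25))    # 0.5/22
```

The paper's proposed variation replaces the fixed `prior` above with a priori probabilities derived from imprecise probabilities; that derivation is not reproduced in this sketch.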
