Abstract
When estimating the class-conditional probability of a continuous attribute, existing naive Bayesian classifier (NBC) methods typically fit the true probability density function with a superposition of normal distribution probability density functions; the class-conditional probability is then the sum of the values of these density functions at the attribute value. In this paper, we propose NPNBC, a naive Bayesian classifier based on neighborhood probability. When NPNBC computes the class-conditional probability for a continuous attribute value in a given unknown example, it creates a small neighborhood around that value in each normal distribution probability density function and obtains the neighborhood probability under each density. The sum of these neighborhood probabilities is the class-conditional probability of the attribute value in NPNBC. Our experimental results demonstrate that NPNBC achieves remarkable classification accuracy compared with the normal method and the kernel method. We also investigate the relationship between the classification accuracy of NPNBC and the size of the neighborhood.
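The core idea described above can be sketched as follows: instead of evaluating each normal density at the attribute value, NPNBC integrates each density over a small neighborhood and sums the resulting probability masses. This is a minimal illustrative sketch, not the authors' implementation; the function names, the per-component parameter list, and the averaging over components are assumptions for the purpose of illustration.

```python
import math

def normal_cdf(x, mu, sigma):
    # CDF of N(mu, sigma^2) at x, via the error function.
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def neighborhood_probability(x, mu, sigma, eps):
    # Probability mass of N(mu, sigma^2) on the neighborhood [x - eps, x + eps].
    return normal_cdf(x + eps, mu, sigma) - normal_cdf(x - eps, mu, sigma)

def class_conditional(x, components, eps):
    # Class-conditional probability of attribute value x: the (normalized) sum
    # of neighborhood probabilities over the normal components fitted for the
    # class. `components` is a hypothetical list of (mu, sigma) pairs.
    total = sum(neighborhood_probability(x, mu, sigma, eps)
                for mu, sigma in components)
    return total / len(components)
```

As the neighborhood shrinks the result tends to zero, and as it grows it approaches the total probability mass, which is consistent with the paper's interest in how accuracy varies with the neighborhood size.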