Abstract

The naïve Bayes classifier (NBC) assumes that the attributes are independent. Although the relationships between attributes can be modeled by a Bayesian network (BN), the network structure and parameters must be trained. Moreover, both the NBC and the BN assign each continuous attribute a prior probability distribution; if the actual distribution of the samples is inconsistent with the assumed prior, classification performance degrades. In this paper, a model-free Bayesian classifier (MFBC) is proposed to address these drawbacks of the NBC and the BN. Information about the sample space and the joint probability density is obtained through a nearest neighbor (NN) strategy. The proposed MFBC handles both discrete and continuous attributes even when no prior probability distribution of the attributes is available, and it requires no network structure, yielding a unified calculation framework for the attributes. Moreover, the probability distribution of a limited number of samples is approximated using the NN method, and the class label of a sample is then predicted by the MFBC. A sensitivity analysis of the MFBC is carried out on UCI data sets. Comparisons with several classical and ensemble classifiers verify the effectiveness and convergence performance of the MFBC.
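To illustrate the general idea of combining Bayes' rule with a nearest-neighbor density estimate, the following is a minimal sketch, not the paper's actual MFBC algorithm: for each class, the class-conditional density at a query point is approximated by a standard k-NN density estimate (k over the volume of the ball reaching the k-th neighbor), and the predicted label maximizes the product of the class prior and this density. All function names and parameter choices here are illustrative assumptions.

```python
import numpy as np
from math import gamma, pi

def knn_density(x, samples, k=5):
    """Classical k-NN density estimate: f(x) ~ k / (n * V_k), where
    V_k is the volume of the d-ball reaching the k-th nearest neighbor.
    (Illustrative; not the MFBC's exact estimator.)"""
    n, d = samples.shape
    dists = np.sort(np.linalg.norm(samples - x, axis=1))
    r = max(dists[min(k, n) - 1], 1e-12)  # radius to k-th neighbor
    vol = (pi ** (d / 2) / gamma(d / 2 + 1)) * r ** d  # d-ball volume
    return k / (n * vol)

def nn_bayes_predict(x, X, y, k=5):
    """Predict the label maximizing prior * k-NN conditional density."""
    labels = np.unique(y)
    scores = []
    for c in labels:
        Xc = X[y == c]
        prior = len(Xc) / len(X)  # empirical class prior
        scores.append(prior * knn_density(x, Xc, k=min(k, len(Xc))))
    return labels[int(np.argmax(scores))]
```

Because the density is estimated directly from the training samples, no parametric form (e.g., Gaussian) needs to be assumed for continuous attributes, which is the model-free aspect the abstract emphasizes.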
