The Averaged One-Dependence Estimators (AODE) algorithm improves on the naive Bayes algorithm by allowing every attribute to depend on one common attribute, called the parent attribute, thereby forming a set of One-Dependence Estimators (ODEs). The classification probability is estimated by averaging the conditional probability estimates of all ODEs. When dependencies exist between attributes, AODE can capture these relationships better than naive Bayes, thus improving classification performance. However, AODE treats the parent and child attribute values in different ODEs as equally important, even though they contribute differently to classification. In this paper, two attribute-value-weighted AODE algorithms based on Kullback–Leibler divergence are proposed: a parent-attribute-value-weighted AODE and a child-attribute-value-weighted AODE. Comparative experiments on 30 datasets from the UCI repository indicate that the parent-attribute-value-weighted AODE with Kullback–Leibler divergence significantly outperforms the original AODE algorithm and also outperforms the mutual-information-weighted AODE algorithm.
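
For concreteness, the standard AODE classification rule (Webb et al., 2005) and one plausible form of the proposed parent-attribute-value weighting are sketched below; the weight $w_{x_i}$ and its exact normalization are illustrative assumptions rather than the paper's definition. AODE averages over all ODEs whose parent value occurs often enough in the training data:
\[
\hat{y} \;=\; \operatorname*{arg\,max}_{y} \sum_{i:\, F(x_i) \ge m} \hat{P}(y, x_i) \prod_{j=1}^{d} \hat{P}(x_j \mid y, x_i),
\]
where $x_1, \dots, x_d$ are the attribute values of the instance, $F(x_i)$ is the training frequency of the parent value $x_i$, and $m$ is a small frequency threshold. A Kullback–Leibler-based parent-value weight could measure how strongly observing $x_i$ shifts the class distribution away from the prior,
\[
w_{x_i} \;=\; \mathrm{KL}\bigl(\hat{P}(Y \mid x_i)\,\big\|\,\hat{P}(Y)\bigr) \;=\; \sum_{y} \hat{P}(y \mid x_i)\,\log\frac{\hat{P}(y \mid x_i)}{\hat{P}(y)},
\]
giving the weighted rule
\[
\hat{y} \;=\; \operatorname*{arg\,max}_{y} \sum_{i:\, F(x_i) \ge m} w_{x_i}\,\hat{P}(y, x_i) \prod_{j=1}^{d} \hat{P}(x_j \mid y, x_i).
\]
A parent value that is uninformative about the class (posterior close to the prior) thus receives a weight near zero, while a highly predictive parent value dominates the average; the child-attribute-value-weighted variant applies an analogous weight inside the product instead.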