Abstract

A new bootstrap-aggregating (bagging) ensemble learning algorithm based on classification certainty and semantic correlation is proposed to improve the classification accuracy of ensemble learning. First, two predetermined length thresholds are introduced to construct long-text and short-text sample subsets, and different deep learning methods are compared to construct the optimal base classifier group for each subset. Then, the random sampling method used in traditional bagging classification algorithms is improved: a threshold-group-based random sampling method is proposed to obtain the long and short training sample subsets for each iteration. Finally, the sample classification certainty of the base classifiers for different categories is defined, and semantic correlation information is integrated into the traditional weighted-voting classifier ensemble method to avoid the loss of important information during sampling. Experimental results on multiple datasets demonstrate that the algorithm significantly improves text classification accuracy and outperforms typical deep learning algorithms. On the CNews dataset, the proposed algorithm improves the F1 measure by approximately 0.082, 0.061, and 0.019 over traditional ensemble learning algorithms such as random forest, M_ADA_A_SMV, and CNN_SVM_LR, respectively. Moreover, it achieves the best F1 values of 0.995, 0.985, and 0.989 on the Spam, CNews, and SogouCS datasets, respectively, when compared with ensemble learning algorithms using different base classifiers.
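
The abstract does not give the formulas for classification certainty or the semantic-correlation weights, so the sketch below is only a schematic Python illustration of the described workflow under stated assumptions: a single length threshold stands in for the paper's two predetermined thresholds, per-iteration training sets are drawn from both the short- and long-text subsets rather than by a plain bootstrap, and each base classifier's vote is scaled by its maximum class probability (a stand-in for the paper's classification certainty) and by a caller-supplied semantic-correlation weight. The names `split_by_length`, `threshold_group_sample`, and `certainty_weighted_vote` are hypothetical, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def split_by_length(texts, labels, length_threshold=100):
    """Split samples into short- and long-text subsets by a length threshold
    (simplified: the paper uses two predetermined thresholds)."""
    short, long_ = [], []
    for t, y in zip(texts, labels):
        (short if len(t) <= length_threshold else long_).append((t, y))
    return short, long_

def threshold_group_sample(short_subset, long_subset, n_short, n_long):
    """Draw the per-iteration training set from both subsets
    (threshold-group sampling) instead of a plain bootstrap draw."""
    idx_s = rng.integers(0, len(short_subset), size=n_short)
    idx_l = rng.integers(0, len(long_subset), size=n_long)
    return [short_subset[i] for i in idx_s] + [long_subset[i] for i in idx_l]

def certainty_weighted_vote(prob_rows, semantic_weights):
    """Combine base-classifier outputs: each classifier's class-probability vector
    is scaled by its certainty (max class probability, an assumption) and by a
    semantic-correlation weight supplied by the caller."""
    scores = np.zeros_like(prob_rows[0], dtype=float)
    for probs, w in zip(prob_rows, semantic_weights):
        certainty = float(np.max(probs))
        scores += certainty * w * np.asarray(probs, dtype=float)
    return int(np.argmax(scores))

# Example: three base classifiers voting on one sample with four classes.
probs = [np.array([0.10, 0.60, 0.20, 0.10]),
         np.array([0.30, 0.30, 0.30, 0.10]),
         np.array([0.05, 0.70, 0.20, 0.05])]
sem_weights = [1.0, 0.8, 1.2]  # hypothetical semantic-correlation weights
print(certainty_weighted_vote(probs, sem_weights))  # -> 1
```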
