Abstract

Naive Bayes (NB) remains one of the most popular methods for text categorization because of its simplicity, efficiency, and efficacy. In existing feature weighting approaches, the learned feature weights are applied only to the classification formula of Naive Bayes. In this paper, we propose a highly efficient method called deep feature weighting Naive Bayes (DFWNB) [1]. DFWNB incorporates the learned weights into both the classification formula and the conditional probability estimates. We define the weight of each feature using the TF-IDF feature weighting method. In the field of data mining, there are numerous studies on English text categorization, but Chinese text classification has received less attention from researchers. We therefore apply deep feature weighting Naive Bayes to Chinese text classification and obtain better performance than ordinary feature weighting Naive Bayes (OFWNB).
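The idea described above can be sketched as follows. This is a minimal illustrative implementation, not the paper's code: the toy corpus, the class name `DFWNB`, and the particular TF-IDF variant and normalization are assumptions made for the example. The key point it shows is that the per-feature weights enter twice, once inside the Laplace-smoothed conditional probability estimates and once in the log-score used for classification.

```python
import math
from collections import Counter, defaultdict

def tfidf_weights(docs):
    """Average TF-IDF of each term across the corpus, used as its feature weight.

    Normalized so the mean weight is 1; plain NB is the all-ones special case.
    """
    n = len(docs)
    df = Counter()
    for d in docs:
        df.update(set(d))
    totals = defaultdict(float)
    for d in docs:
        for t, c in Counter(d).items():
            totals[t] += (c / len(d)) * (math.log(n / df[t]) + 1.0)
    w = {t: s / df[t] for t, s in totals.items()}
    mean = sum(w.values()) / len(w)
    return {t: v / mean for t, v in w.items()}

class DFWNB:
    """Multinomial NB with TF-IDF feature weights applied 'deeply':
    in the conditional probability estimates AND the classification formula."""

    def __init__(self, docs, labels):
        self.w = tfidf_weights(docs)
        self.vocab = set(self.w)
        self.priors, self.cond = {}, {}
        for c in set(labels):
            in_c = [d for d, y in zip(docs, labels) if y == c]
            self.priors[c] = len(in_c) / len(docs)
            # Deep weighting: weighted term counts inside the
            # Laplace-smoothed conditional probability estimates.
            wc = defaultdict(float)
            for d in in_c:
                for t in d:
                    wc[t] += self.w[t]
            total = sum(wc.values())
            self.cond[c] = {t: (1.0 + wc[t]) / (len(self.vocab) + total)
                            for t in self.vocab}

    def predict(self, doc):
        def score(c):
            s = math.log(self.priors[c])
            for t in doc:
                if t in self.vocab:
                    # The weights also appear in the classification formula.
                    s += self.w[t] * math.log(self.cond[c][t])
            return s
        return max(self.priors, key=score)

# Toy bilingual-agnostic corpus (tokens stand in for segmented words;
# for Chinese text a word segmenter would produce the token lists).
docs = [["ball", "game", "team"], ["ball", "win", "game"],
        ["code", "python", "bug"], ["code", "compile", "bug"]]
labels = ["sports", "sports", "tech", "tech"]
clf = DFWNB(docs, labels)
```

Ordinary feature weighting (OFWNB in the abstract's terms) would use `self.w[t]` only in `predict` and raw counts in `self.cond`; the "deep" variant threads the same weights through training as well.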
