Abstract

Feature weighting is used to alleviate the conditional independence assumption of Naïve Bayes text classifiers and consequently improve their generalization performance. Most traditional feature weighting algorithms use general feature weighting, which assigns the same weight to each feature for all classes. We focus on class-specific feature weighting approaches, which discriminatively assign each feature a specific weight for each class. This paper uses a statistical feature weighting technique and proposes a new class-specific deep feature weighting method for Multinomial Naïve Bayes text classifiers. In this deep feature weighting method, feature weights are incorporated not only into the classification formulas but also into the conditional probability estimates of Multinomial Naïve Bayes text classifiers. Experimental results on a large number of text classification datasets validate the effectiveness and efficiency of our method.
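The abstract's key idea, as described, is that class-specific weights enter the model twice: once in the conditional probability estimates during training, and again in the classification formula at prediction time. The following is a minimal NumPy sketch of that scheme, not the authors' implementation; all function names and the toy data are illustrative, and the weight matrix `W` is assumed to be precomputed by some statistical measure (the abstract names a statistical technique but not its details).

```python
import numpy as np

def train_weighted_mnb(X, y, W):
    """Estimate MNB parameters with class-specific deep feature weighting.

    X: (n_docs, n_feats) term-frequency matrix
    y: (n_docs,) class labels in {0, ..., n_classes-1}
    W: (n_classes, n_feats) class-specific feature weights (assumed given)

    The weights enter the conditional probability estimates: term counts
    for class c are scaled by W[c] before Laplace smoothing.
    """
    n_classes, n_feats = W.shape
    log_priors = np.zeros(n_classes)
    log_cond = np.zeros((n_classes, n_feats))
    for c in range(n_classes):
        Xc = X[y == c]
        # Laplace-smoothed class prior
        log_priors[c] = np.log((len(Xc) + 1) / (len(X) + n_classes))
        # Weighted term counts, then Laplace-smoothed conditionals
        counts = W[c] * Xc.sum(axis=0)
        log_cond[c] = np.log((counts + 1) / (counts.sum() + n_feats))
    return log_priors, log_cond

def predict(X, log_priors, log_cond, W):
    """The weights also enter the classification formula:
    score(d, c) = log P(c) + sum_t W[c, t] * f_{d, t} * log P(t | c)
    """
    scores = log_priors + X @ (W * log_cond).T
    return scores.argmax(axis=1)
```

With `W` set to all ones, both formulas collapse to standard Multinomial Naïve Bayes, which makes the sketch easy to sanity-check before plugging in real weights.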

Highlights

  • With the explosive growth of text information on the Internet, automated processing of massive text data has become a challenge

  • This study focuses on class-specific feature weighting approaches for multinomial Naïve Bayes (MNB) text classifiers



Introduction

S. Ruan et al., "Class-Specific Deep Feature Weighting for Naïve Bayes Text Classifiers"

With the explosive growth of text information on the Internet, automated processing of massive text data has become a challenge.

Index Terms: Multinomial Naïve Bayes text classifiers, class-specific feature weighting, statistic, deep feature weighting.

