Abstract

Currently, text classification studies mainly focus on training classifiers with textual input alone, or on enhancing semantic features by introducing external knowledge (e.g., hand-crafted lexicons and domain knowledge). In contrast, intrinsic statistical features of the corpus, such as word frequency and distribution over labels, are not well exploited. Compared with external knowledge, these statistical features are deterministic and naturally compatible with the corresponding task. In this paper, we propose an Adaptive Gate Network (AGN) to selectively consolidate semantic representations with statistical features. In particular, AGN encodes statistical features through a variational component and merges the information via a well-designed valve mechanism. The valve adapts the flow of statistical information into the classifier according to the confidence of the semantic features in decision making, which facilitates training a robust classifier and addresses the overfitting that can arise from using statistical features. Extensive experiments on datasets of various scales show that, by incorporating statistical information, AGN effectively improves the classification performance of CNN-, RNN-, Transformer-, and BERT-based models. The experiments also indicate the robustness of AGN against adversarial attacks that manipulate statistical information.
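
To make the gating idea concrete, the sketch below illustrates one plausible reading of the abstract: a statistical feature vector is passed through a variational encoder, and a sigmoid valve driven by the semantic representation decides how much of the resulting latent code is merged into the classifier input. This is a minimal illustration under our own assumptions, not the authors' implementation; the class and parameter names (AdaptiveGateSketch, semantic_dim, stat_dim, latent_dim) are hypothetical.

```python
import torch
import torch.nn as nn

class AdaptiveGateSketch(nn.Module):
    """Hypothetical sketch of gated fusion of semantic and statistical features.

    A variational encoder maps corpus statistics (e.g., word-frequency or
    label-distribution vectors) to a latent code; a sigmoid valve, conditioned
    on the semantic representation, controls how much of that code flows into
    the fused representation used by the classifier.
    """

    def __init__(self, semantic_dim: int, stat_dim: int, latent_dim: int):
        super().__init__()
        # Variational encoder for statistical features: mean and log-variance heads.
        self.mu = nn.Linear(stat_dim, latent_dim)
        self.logvar = nn.Linear(stat_dim, latent_dim)
        # Valve: the semantic representation determines the gate values in (0, 1).
        self.valve = nn.Sequential(nn.Linear(semantic_dim, latent_dim), nn.Sigmoid())
        self.project = nn.Linear(semantic_dim + latent_dim, semantic_dim)

    def forward(self, semantic: torch.Tensor, stats: torch.Tensor) -> torch.Tensor:
        mu, logvar = self.mu(stats), self.logvar(stats)
        # Reparameterization trick: sample a latent statistical code.
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)
        gate = self.valve(semantic)
        # Merge the gated statistical code with the semantic representation.
        fused = torch.cat([semantic, gate * z], dim=-1)
        return self.project(fused)
```

As a usage example, a batch of sentence embeddings of size (B, semantic_dim) and statistical vectors of size (B, stat_dim) would yield a fused tensor of size (B, semantic_dim), which could then be fed to any downstream classification head.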
