Abstract

Neural-based text classification methods have attracted increasing attention in recent years. Unlike standard text classification methods, neural-based methods learn text representations end-to-end from the data. Many useful insights can be derived from neural-based text classifiers, as demonstrated by an ever-growing body of work on text mining. However, real-world text can be both complex and noisy, which poses a challenge for effective text classification. An effective way to address this issue is to incorporate self-attention and capsule networks into text mining solutions. In this paper, we propose a Bi-dynamic routing Capsule Network with Label-constraint (BCNL) model for text classification, which overcomes the limitations of previous methods by automatically learning the task-relevant and label-relevant words of a text. Specifically, we use a Bi-LSTM and a self-attention network with positional encoding to learn text embeddings. In addition, we propose a bi-dynamic routing capsule network with a label constraint to adjust the category distribution of the text capsules. Through extensive experiments on four datasets, we observe that our method outperforms state-of-the-art baseline methods.
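
The following is a minimal PyTorch sketch of the encoding pipeline described above: token embeddings with positional encoding pass through a Bi-LSTM and a self-attention layer, and the resulting text features are routed to class capsules. The abstract does not specify the bi-dynamic routing or the label constraint in detail, so standard routing-by-agreement is used as a placeholder; all module names, dimensions, and hyperparameters here are illustrative assumptions, not values from the paper.

```python
# Illustrative sketch only; not the authors' released implementation.
import math
import torch
import torch.nn as nn
import torch.nn.functional as F


def positional_encoding(seq_len: int, dim: int) -> torch.Tensor:
    """Standard sinusoidal positional encoding, shape (seq_len, dim)."""
    pos = torch.arange(seq_len, dtype=torch.float32).unsqueeze(1)
    div = torch.exp(torch.arange(0, dim, 2, dtype=torch.float32)
                    * (-math.log(10000.0) / dim))
    pe = torch.zeros(seq_len, dim)
    pe[:, 0::2] = torch.sin(pos * div)
    pe[:, 1::2] = torch.cos(pos * div)
    return pe


class BCNLSketch(nn.Module):
    """Bi-LSTM + self-attention encoder feeding class capsules (assumed layout)."""

    def __init__(self, vocab_size, embed_dim=128, hidden_dim=128,
                 num_classes=4, capsule_dim=16, routing_iters=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * hidden_dim, num_heads=4,
                                          batch_first=True)
        # One prediction vector per class capsule for every token position.
        self.caps_proj = nn.Linear(2 * hidden_dim, num_classes * capsule_dim)
        self.num_classes = num_classes
        self.capsule_dim = capsule_dim
        self.routing_iters = routing_iters

    @staticmethod
    def squash(v, dim=-1):
        norm2 = (v ** 2).sum(dim=dim, keepdim=True)
        return (norm2 / (1.0 + norm2)) * v / torch.sqrt(norm2 + 1e-8)

    def forward(self, tokens):                       # tokens: (batch, seq_len)
        x = self.embed(tokens)
        x = x + positional_encoding(x.size(1), x.size(2)).to(x.device)
        h, _ = self.bilstm(x)                        # (batch, seq_len, 2*hidden)
        h, _ = self.attn(h, h, h)                    # self-attention over tokens
        # Prediction vectors u_hat: (batch, seq_len, num_classes, capsule_dim)
        u_hat = self.caps_proj(h).view(h.size(0), h.size(1),
                                       self.num_classes, self.capsule_dim)
        b = torch.zeros(h.size(0), h.size(1), self.num_classes, device=h.device)
        for _ in range(self.routing_iters):          # routing-by-agreement
            c = F.softmax(b, dim=-1).unsqueeze(-1)   # coupling coefficients
            s = (c * u_hat).sum(dim=1)               # (batch, classes, caps_dim)
            v = self.squash(s)                       # class capsules
            b = b + (u_hat * v.unsqueeze(1)).sum(-1)
        return v.norm(dim=-1)                        # capsule lengths as scores
```

In this layout, the length of each class capsule serves as the class score, following the usual capsule-network convention; the label-constrained bi-dynamic routing proposed in the paper would replace the plain routing loop shown here.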
