Abstract

Classification performance on news text sequences depends strongly on the importance of individual words and the dependency relationships among them. Although a capsule network can learn correlations between the global and local structure of news text, it pays insufficient attention to key words and ignores long-range dependencies within the text. To remedy these shortcomings, this paper proposes a news text classification model based on multi-head attention and parallel capsule networks: a multi-head attention layer performs feature extraction, and a parallel capsule network module serves as the classification layer. The model can therefore capture richer textual details. Experimental results demonstrate that the proposed model outperforms mainstream capsule-network-based text classification models on both single-label and multi-label news text classification tasks.
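A minimal sketch of the described architecture is given below, assuming PyTorch. The module names, embedding size, number of branches, kernel sizes, and the simplified single-step routing are illustrative assumptions, not the authors' implementation; they only show how a multi-head attention feature extractor can feed parallel capsule branches that act as the classification layer.

```python
# Illustrative sketch only: hyperparameters and the simplified routing step
# are assumptions, not the model proposed in the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F


def squash(s, dim=-1, eps=1e-8):
    """Capsule squashing non-linearity: keeps direction, bounds length in [0, 1)."""
    norm_sq = (s ** 2).sum(dim=dim, keepdim=True)
    return (norm_sq / (1.0 + norm_sq)) * s / torch.sqrt(norm_sq + eps)


class CapsuleBranch(nn.Module):
    """One capsule branch: primary capsules from a 1-D convolution, then a single
    agreement-weighted aggregation step to class capsules (simplified routing)."""

    def __init__(self, in_dim, num_classes, caps_dim=16, kernel_size=3):
        super().__init__()
        self.caps_dim = caps_dim
        self.primary = nn.Conv1d(in_dim, 32 * caps_dim, kernel_size, padding=kernel_size // 2)
        self.route_weight = nn.Parameter(0.01 * torch.randn(num_classes, caps_dim, caps_dim))

    def forward(self, x):                      # x: (batch, seq_len, in_dim)
        u = self.primary(x.transpose(1, 2))    # (batch, 32 * caps_dim, seq_len)
        b, _, seq_len = u.shape
        u = u.view(b, 32, self.caps_dim, seq_len).permute(0, 3, 1, 2).reshape(b, -1, self.caps_dim)
        u = squash(u)                          # primary capsules: (batch, N, caps_dim)
        # Each primary capsule predicts every class capsule.
        u_hat = torch.einsum('bnd,cde->bnce', u, self.route_weight)
        # Uniform coupling coefficients stand in for full iterative dynamic routing.
        logits = torch.zeros(b, u_hat.size(1), u_hat.size(2), device=x.device)
        c = F.softmax(logits, dim=-1).unsqueeze(-1)
        v = squash((c * u_hat).sum(dim=1))     # class capsules: (batch, num_classes, caps_dim)
        return v.norm(dim=-1)                  # capsule lengths as class scores


class AttnParallelCapsNet(nn.Module):
    """Multi-head attention feature extractor followed by parallel capsule branches."""

    def __init__(self, vocab_size, num_classes, embed_dim=128, num_heads=4, num_branches=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.branches = nn.ModuleList(
            [CapsuleBranch(embed_dim, num_classes, kernel_size=2 * i + 3)
             for i in range(num_branches)]
        )

    def forward(self, token_ids):              # token_ids: (batch, seq_len)
        x = self.embed(token_ids)
        h, _ = self.attn(x, x, x)              # self-attention over the token sequence
        # Average the class scores produced by the parallel capsule branches.
        return torch.stack([branch(h) for branch in self.branches]).mean(dim=0)


if __name__ == "__main__":
    model = AttnParallelCapsNet(vocab_size=10000, num_classes=5)
    scores = model(torch.randint(0, 10000, (8, 64)))   # 8 texts, 64 tokens each
    print(scores.shape)                                 # torch.Size([8, 5])
```

Because the class scores are capsule lengths in [0, 1), the same output can be thresholded per class for multi-label classification or arg-maxed for single-label classification, matching the two task settings evaluated in the paper.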
