Abstract

Text classification is one of the basic tasks of natural language processing. In recent years, deep learning has been widely applied to text classification, with the convolutional neural network as a representative approach. DPCNN is a deep convolutional text classification model that can capture long-distance dependencies, but it focuses on extracting global features and neglects local features of the text, some of which carry important information and play a significant role in text classification. Therefore, this paper introduces a self-attention mechanism to extract local text features and compensate for this weakness of DPCNN. In addition, although a deep convolutional network can extract deeper features, it is prone to vanishing gradients during training; a highway network is therefore introduced to mitigate this problem and improve model performance. Experimental results show that the proposed model outperforms a single DPCNN model and further improves the accuracy of text classification.
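
The abstract describes the overall architecture only at a high level; the following PyTorch sketch is one possible reading of it, not the authors' implementation. It combines a DPCNN-style convolutional branch (global features), a self-attention branch over token embeddings (local features), and a highway layer before the classifier. All layer sizes, the pooling scheme, and the way the two branches are fused are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class HighwayLayer(nn.Module):
    """Highway network layer: y = T(x) * H(x) + (1 - T(x)) * x."""
    def __init__(self, dim):
        super().__init__()
        self.transform = nn.Linear(dim, dim)   # H(x)
        self.gate = nn.Linear(dim, dim)        # T(x), the carry gate

    def forward(self, x):
        t = torch.sigmoid(self.gate(x))
        h = F.relu(self.transform(x))
        return t * h + (1.0 - t) * x           # gated mix of transform and identity

class DPCNNSelfAttention(nn.Module):
    """DPCNN-style encoder with an added self-attention branch and a highway layer (sketch)."""
    def __init__(self, vocab_size, embed_dim=128, num_filters=128,
                 num_classes=2, num_blocks=3, num_heads=4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.region = nn.Conv1d(embed_dim, num_filters, kernel_size=3, padding=1)
        # Repeated DPCNN-style blocks: two equal-size convolutions with a
        # residual shortcut, followed by stride-2 max pooling, which deepens
        # the network while halving the sequence length.
        self.convs = nn.ModuleList(
            nn.Sequential(
                nn.ReLU(),
                nn.Conv1d(num_filters, num_filters, kernel_size=3, padding=1),
                nn.ReLU(),
                nn.Conv1d(num_filters, num_filters, kernel_size=3, padding=1),
            )
            for _ in range(num_blocks)
        )
        # Self-attention over the embedded tokens to capture local features.
        self.attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.highway = HighwayLayer(num_filters + embed_dim)
        self.fc = nn.Linear(num_filters + embed_dim, num_classes)

    def forward(self, token_ids):                       # (batch, seq_len)
        emb = self.embed(token_ids)                     # (batch, seq_len, embed_dim)

        # Global branch: stacked convolutions with pooling, as in DPCNN.
        x = self.region(emb.transpose(1, 2))            # (batch, filters, seq_len)
        for block in self.convs:
            x = x + block(x)                            # residual shortcut
            x = F.max_pool1d(x, kernel_size=2, ceil_mode=True)
        global_feat = x.max(dim=-1).values              # (batch, filters)

        # Local branch: self-attention over token embeddings.
        attn_out, _ = self.attn(emb, emb, emb)          # (batch, seq_len, embed_dim)
        local_feat = attn_out.mean(dim=1)               # (batch, embed_dim)

        # Fuse both branches and pass through the highway layer to ease
        # gradient flow, then classify.
        fused = torch.cat([global_feat, local_feat], dim=-1)
        fused = self.highway(fused)
        return self.fc(fused)                           # class logits

# Example: classify a batch of two padded sequences of 32 token ids.
model = DPCNNSelfAttention(vocab_size=10000)
logits = model(torch.randint(0, 10000, (2, 32)))
print(logits.shape)  # torch.Size([2, 2])
```

The highway layer's gate lets the network carry features through unchanged when the learned transform is unhelpful, which is the mechanism the abstract relies on to counter vanishing gradients in the deeper stack.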
