Abstract

Convolutional neural networks (CNNs) and recurrent neural networks (RNNs) have proven to be effective text representation models for sentiment analysis. However, a CNN considers only the local information between consecutive words, ignoring long-distance contextual dependencies, and its pooling operation discards some semantic information during forward propagation; an RNN, meanwhile, shows no significant advantage over a CNN when texts are short. This paper extracts text features by integrating an attention-based bidirectional long short-term memory network (AttBiLSTM) with a multi-channel CNN (MCNN) to improve the model's text representation. Comparative experiments on two datasets show that the hybrid model effectively improves text classification accuracy.
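The abstract does not give implementation details, so the following is only a minimal PyTorch sketch of one plausible realization of the described hybrid: a BiLSTM branch with additive attention over its hidden states, a multi-channel CNN branch with parallel 1-D convolutions of different kernel sizes followed by max-over-time pooling, and a classifier over the concatenated features. All names and hyperparameters (`AttBiLSTM_MCNN`, `hidden_dim`, `kernel_sizes`, etc.) are illustrative assumptions, not the authors' code.

```python
# Hypothetical sketch of the AttBiLSTM + MCNN hybrid described in the abstract.
# Hyperparameters and structure are assumptions; the paper's details may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F

class AttBiLSTM_MCNN(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=64,
                 kernel_sizes=(3, 4, 5), num_filters=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # BiLSTM branch: captures long-distance context in both directions
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        # Additive attention: one score per time step over BiLSTM states
        self.att_w = nn.Linear(2 * hidden_dim, 1, bias=False)
        # Multi-channel CNN branch: parallel 1-D convolutions over the
        # embeddings, one channel per kernel size (local n-gram features)
        self.convs = nn.ModuleList(
            nn.Conv1d(embed_dim, num_filters, k) for k in kernel_sizes)
        fused_dim = 2 * hidden_dim + num_filters * len(kernel_sizes)
        self.fc = nn.Linear(fused_dim, num_classes)

    def forward(self, x):                      # x: (batch, seq_len) token ids
        e = self.embed(x)                      # (batch, seq_len, embed_dim)
        h, _ = self.bilstm(e)                  # (batch, seq_len, 2*hidden_dim)
        scores = self.att_w(torch.tanh(h))     # (batch, seq_len, 1)
        alpha = F.softmax(scores, dim=1)       # attention weights over time
        lstm_feat = (alpha * h).sum(dim=1)     # (batch, 2*hidden_dim)
        c = e.transpose(1, 2)                  # (batch, embed_dim, seq_len)
        cnn_feat = torch.cat(                  # max-over-time pool per channel
            [F.relu(conv(c)).max(dim=2).values for conv in self.convs], dim=1)
        return self.fc(torch.cat([lstm_feat, cnn_feat], dim=1))
```

Fusing the two branches by concatenation lets the attention-weighted BiLSTM features supply long-range context while the multi-kernel convolutions supply local n-gram features, which is the complementarity the abstract motivates.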
