Abstract

To address the problems that a traditional Convolutional Neural Network (CNN) ignores contextual semantic information and a traditional Recurrent Neural Network (RNN) suffers from information memory loss and vanishing gradients, this paper proposes a Bi-directional Encoder Representations from Transformers (BERT)-based dual-channel parallel hybrid neural network model for text sentiment analysis. The BERT model converts text into word vectors; the dual-channel parallel hybrid network, built from a CNN and a Bi-directional Long Short-Term Memory (BiLSTM) network, extracts local and global semantic features of the text, yielding more comprehensive sentiment features; and an attention mechanism gives greater weight to important words, improving the model's sentiment classification ability. Finally, the outputs of the two channels are fused for sentiment classification. Experimental results on hotel review datasets show that the proposed model reaches an accuracy of 92.35% and an F1 score of 91.59% in sentiment classification.
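The dual-channel architecture summarized above can be sketched roughly as follows. This is a minimal PyTorch illustration, not the authors' implementation: the kernel size, channel counts, hidden sizes, pooling choice, and the additive attention form are all assumptions; only the overall structure (BERT word vectors feeding a parallel CNN channel and a BiLSTM-with-attention channel, fused for classification) follows the abstract.

```python
import torch
import torch.nn as nn

class DualChannelSentiment(nn.Module):
    """Sketch of a BERT-based dual-channel CNN/BiLSTM model with attention.

    All dimensions below are illustrative assumptions (embed_dim=768 matches
    BERT-base output; conv_channels, lstm_hidden, and kernel_size are guesses).
    """

    def __init__(self, embed_dim=768, num_classes=2,
                 conv_channels=128, lstm_hidden=128):
        super().__init__()
        # Channel 1: CNN extracts local semantic features
        self.conv = nn.Conv1d(embed_dim, conv_channels, kernel_size=3, padding=1)
        # Channel 2: BiLSTM extracts global semantic features
        self.bilstm = nn.LSTM(embed_dim, lstm_hidden,
                              batch_first=True, bidirectional=True)
        # Simple additive attention over the BiLSTM outputs (assumed form)
        self.attn = nn.Linear(2 * lstm_hidden, 1)
        # Fuse the two channels' features for the final classification
        self.fc = nn.Linear(conv_channels + 2 * lstm_hidden, num_classes)

    def forward(self, x):
        # x: (batch, seq_len, embed_dim) word vectors from a frozen/fine-tuned BERT
        local = torch.relu(self.conv(x.transpose(1, 2)))  # (batch, C, seq_len)
        local = local.max(dim=2).values                   # max-pool over time
        out, _ = self.bilstm(x)                           # (batch, seq_len, 2H)
        weights = torch.softmax(self.attn(out), dim=1)    # attention weights
        global_feat = (weights * out).sum(dim=1)          # weighted sum over time
        return self.fc(torch.cat([local, global_feat], dim=1))
```

In practice the input `x` would come from a BERT encoder (e.g. the last hidden states of `bert-base-chinese` for hotel reviews), and the fused logits would be trained with cross-entropy against the sentiment labels.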
