Abstract
To address the problems that a traditional Convolutional Neural Network (CNN) ignores contextual semantic information and that a traditional Recurrent Neural Network (RNN) suffers from information memory loss and vanishing gradients, this paper proposes a Bidirectional Encoder Representations from Transformers (BERT)-based dual-channel parallel hybrid neural network model for text sentiment analysis. The BERT model converts the text into word vectors. A dual-channel parallel hybrid network built from a CNN and a Bidirectional Long Short-Term Memory (BiLSTM) network extracts local and global semantic features of the text, yielding more comprehensive sentiment features. An attention mechanism gives greater weight to important words, improving the model's sentiment classification ability. Finally, the features output by the two channels are fused for sentiment classification. Experimental results on hotel review datasets show that the proposed model reaches an accuracy of 92.35% and an F1 score of 91.59% in sentiment classification.
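The core fusion step described above — attention-weighted pooling of each channel's feature sequence, followed by concatenation of the two channel outputs — can be sketched as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: the BERT, CNN, and BiLSTM stages are replaced by random placeholder matrices, and the attention query vector `w`, dimensions, and function names are all hypothetical.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, w):
    # H: (seq_len, d) feature sequence from one channel.
    # w: (d,) attention query (learned in the real model; fixed here).
    scores = H @ w            # one relevance score per word position
    alpha = softmax(scores)   # attention weights sum to 1 over positions
    return alpha @ H          # (d,) attention-weighted feature vector

rng = np.random.default_rng(0)
H_cnn = rng.normal(size=(8, 16))   # placeholder for CNN local features
H_bilstm = rng.normal(size=(8, 16))  # placeholder for BiLSTM global features
w = rng.normal(size=16)

# Fuse the two channel outputs by concatenation; in the full model this
# fused vector would feed a softmax layer for sentiment classification.
fused = np.concatenate([attention_pool(H_cnn, w),
                        attention_pool(H_bilstm, w)])
```

The concatenation doubles the feature dimension (here 16 per channel, 32 fused), which is one simple way to combine parallel channels without forcing them to share a representation.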