Abstract

To address the problem that Word2vec's static encoding cannot produce word vectors that reflect contextual semantics and therefore cannot handle polysemy, we propose using the BERT pre-trained model as the word embedding layer to obtain word vectors dynamically; we also introduce a gating idea to improve the traditional attention mechanism, yielding the BERT-BiGRU-GANet model. The model first uses the BERT pre-trained model as the word vector layer to encode the input text dynamically; second, it uses a bidirectional gated recurrent unit (BiGRU) to capture long-range dependencies in the discourse and further analyze contextual semantics; finally, before the output classification, it applies an attention mechanism fused with gating that suppresses features of little relevance and highlights key features through their weights. We conducted several comparison experiments on the public Jingdong product review dataset, where the model achieved an F1 score of 93.06%, which is 3.41%, 2.55%, and 1.12% higher than the BiLSTM, BiLSTM-Att, and BERT-BiGRU models, respectively. These results indicate that the BERT-BiGRU-GANet model improves Chinese text sentiment analysis, which can help in analyzing product and service reviews, assist consumers in selecting goods, and help merchants improve their goods or services.
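For concreteness, the following is a minimal sketch of the described pipeline (BERT embeddings, BiGRU, gated attention, classifier), assuming PyTorch and Hugging Face Transformers. The bert-base-chinese checkpoint, the hidden size, and the particular sigmoid-gated additive attention formulation are illustrative assumptions, not the authors' exact configuration.

    import torch
    import torch.nn as nn
    from transformers import BertModel

    class BertBiGRUGANet(nn.Module):
        def __init__(self, bert_name="bert-base-chinese",
                     hidden=128, num_classes=2):
            super().__init__()
            # BERT supplies dynamic, context-sensitive token vectors,
            # unlike the static vectors of Word2vec.
            self.bert = BertModel.from_pretrained(bert_name)
            # BiGRU reads the sequence in both directions to capture
            # long-range dependencies.
            self.bigru = nn.GRU(self.bert.config.hidden_size, hidden,
                                batch_first=True, bidirectional=True)
            # Additive attention scores each token's feature vector.
            self.proj = nn.Linear(2 * hidden, 2 * hidden)
            self.score = nn.Linear(2 * hidden, 1)
            # Sigmoid gate (an assumed formulation) that suppresses
            # low-relevance token features before the weighted pooling.
            self.gate = nn.Linear(2 * hidden, 1)
            self.classifier = nn.Linear(2 * hidden, num_classes)

        def forward(self, input_ids, attention_mask):
            h = self.bert(input_ids=input_ids,
                          attention_mask=attention_mask).last_hidden_state
            h, _ = self.bigru(h)                           # (B, T, 2H)
            scores = self.score(torch.tanh(self.proj(h)))  # (B, T, 1)
            scores = scores.masked_fill(
                attention_mask.unsqueeze(-1) == 0, float("-inf"))
            weights = torch.softmax(scores, dim=1)         # attention weights
            g = torch.sigmoid(self.gate(h))                # per-token gate
            context = (weights * g * h).sum(dim=1)         # gated pooling
            return self.classifier(context)                # class logits

In this sketch, the gate scales each token's contribution toward zero when its features carry little relevance, while the softmax attention weights emphasize the key features, matching the "ignore low-relevance, highlight key features" behavior described above.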
