Abstract

With the development of social media, the volume of comment text on online platforms is growing rapidly. Judging the sentiment polarity of comments quickly and accurately is of great significance in fields such as product review analysis and public opinion monitoring. Short text reviews suffer from the problem of sparse features. At present, most sentiment analysis models use TF-IDF, word2vec, or GloVe to obtain word vector representations, which cannot fully express the contextual semantic information of the text. Moreover, mainstream bidirectional recurrent neural network models rely heavily on sequence information, making it difficult for them to attend to the important information in the text. To solve this problem, this paper proposes a sentiment analysis method that combines BERT word vectors with the Transformer self-attention model. Rich semantic representations are obtained through the BERT model, and feature weights are dynamically adjusted by the self-attention model to classify the sentiment of short texts. The experimental results show that BERT word vectors enhance the semantic representation of the text, and the self-attention model reduces the dependence on external parameters, allowing the model to focus on its own key information; classification performance is significantly improved. This paper makes two contributions. First, the feature-sparsity problem of short texts is addressed by using BERT word vectors to improve the semantic expression of the text and obtain more comprehensive semantic information. Second, feature information is weighted by the self-attention mechanism to highlight the important information in the text.

Keywords: Sentiment analysis, BERT, Self-attention, Transformer
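The abstract's core mechanism, dynamically re-weighting token features by their relevance to one another, is scaled dot-product self-attention. The following is a minimal, dependency-free sketch of that operation over a handful of toy token vectors; it uses no learned query/key/value projections (Q = K = V = the inputs), so the function names and the simplification are illustrative assumptions, not the paper's actual implementation.

```python
import math

def softmax(xs):
    # Numerically stable softmax: shift by the max before exponentiating.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def self_attention(tokens):
    """Scaled dot-product self-attention over a list of token vectors.

    Illustrative simplification: Q = K = V = the raw input vectors
    (no learned projection matrices). Each output vector is a
    weighted average of all input vectors, where the weights come
    from softmax(q . k / sqrt(d)) -- so tokens similar to the query
    token contribute more, which is how "feature weights are
    dynamically adjusted" in the abstract's description.
    """
    d = len(tokens[0])
    out = []
    for q in tokens:
        # Similarity of this token to every token, scaled by sqrt(d).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d)
                  for k in tokens]
        weights = softmax(scores)  # attention weights, sum to 1
        # Weighted average of all token vectors.
        out.append([sum(w * v[i] for w, v in zip(weights, tokens))
                    for i in range(d)])
    return out

# Toy usage: three 2-dimensional "token embeddings".
tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
attended = self_attention(tokens)
```

Because each output is a convex combination of the inputs, every output coordinate stays within the range of the corresponding input coordinates; in the full model, learned projections and multiple heads make this re-weighting far more expressive.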

