Abstract

This study proposes a sentiment analysis method for consumer comment text based on Bidirectional Encoder Representations from Transformers (BERT) and hierarchical attention. First, the BERT pre-training model fuses left and right contextual information to enhance the semantic representation of words and generate dynamic word vectors that contain contextual semantics. Second, a bidirectional long short-term memory network is used to obtain the sequence feature matrix, and the sentence representation and the text representation are obtained with a two-layer long short-term memory network. Finally, a local attention mechanism and a global attention mechanism are introduced into the sentence representation layer and the text representation layer, respectively, and the sentiment of consumer comment text is classified with softmax. Experiments show that the proposed method achieves an accuracy of 93.01% on the Laptop dataset and 92.45% on the Restaurant dataset; its performance in sentiment analysis of consumer comment text is therefore significantly better than that of the comparison method.
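To make the pipeline concrete, the following PyTorch sketch wires together the stages the abstract describes: BERT contextual word vectors, a word-level BiLSTM with local attention producing sentence representations, a sentence-level LSTM with global attention producing the text representation, and a softmax classifier. The class names, hidden sizes, and the additive form of the attention are illustrative assumptions, not details taken from the paper.

    # Minimal sketch of the described pipeline; hyperparameters and attention
    # form are assumptions for illustration, not the authors' exact settings.
    import torch
    import torch.nn as nn
    from transformers import BertModel


    class AdditiveAttention(nn.Module):
        """Score each timestep, normalize with softmax, return the weighted sum."""
        def __init__(self, dim):
            super().__init__()
            self.proj = nn.Linear(dim, dim)
            self.context = nn.Linear(dim, 1, bias=False)

        def forward(self, h):                                  # h: (batch, steps, dim)
            scores = self.context(torch.tanh(self.proj(h)))    # (batch, steps, 1)
            weights = torch.softmax(scores, dim=1)
            return (weights * h).sum(dim=1)                    # (batch, dim)


    class HierAttnSentimentModel(nn.Module):
        def __init__(self, num_classes=3, hidden=128):
            super().__init__()
            # Dynamic, context-aware word vectors from the BERT pre-training model.
            self.bert = BertModel.from_pretrained("bert-base-uncased")
            emb = self.bert.config.hidden_size                 # 768 for bert-base
            # Word-level BiLSTM builds the sequence feature matrix per sentence.
            self.word_lstm = nn.LSTM(emb, hidden, batch_first=True, bidirectional=True)
            self.local_attn = AdditiveAttention(2 * hidden)    # sentence representation
            # Sentence-level LSTM aggregates sentence vectors into a text representation.
            self.sent_lstm = nn.LSTM(2 * hidden, hidden, batch_first=True, bidirectional=True)
            self.global_attn = AdditiveAttention(2 * hidden)   # text representation
            self.classifier = nn.Linear(2 * hidden, num_classes)

        def forward(self, input_ids, attention_mask):
            # input_ids / attention_mask: (batch, num_sentences, seq_len)
            b, s, t = input_ids.shape
            flat_ids = input_ids.view(b * s, t)
            flat_mask = attention_mask.view(b * s, t)
            word_vecs = self.bert(input_ids=flat_ids,
                                  attention_mask=flat_mask).last_hidden_state
            word_feats, _ = self.word_lstm(word_vecs)          # (b*s, t, 2*hidden)
            sent_vecs = self.local_attn(word_feats).view(b, s, -1)
            sent_feats, _ = self.sent_lstm(sent_vecs)          # (b, s, 2*hidden)
            text_vec = self.global_attn(sent_feats)
            return torch.softmax(self.classifier(text_vec), dim=-1)


    # Usage with dummy tokenized input: 2 reviews, 4 sentences each, 16 tokens per sentence.
    model = HierAttnSentimentModel(num_classes=3)
    dummy_ids = torch.randint(0, 30522, (2, 4, 16))
    dummy_mask = torch.ones_like(dummy_ids)
    probs = model(dummy_ids, dummy_mask)                       # (2, 3) class probabilities

The two attention layers mirror the hierarchy in the abstract: the local attention weights individual words when forming each sentence vector, while the global attention weights whole sentences when forming the review-level vector passed to softmax.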
