Abstract
This study proposes an emotional analysis method for consumer comment text based on Bidirectional Encoder Representations from Transformers (BERT) and hierarchical attention. First, the BERT pre-training model fuses left and right contextual information to enhance the semantic representation of words and generate dynamic word vectors that carry contextual semantics. Second, a bidirectional long short-term memory network produces the sequence feature matrix, and a two-layer long short-term memory structure yields the sentence representation and the text representation. Finally, a local attention mechanism and a global attention mechanism are introduced at the sentence representation layer and the text representation layer, respectively, and the emotion of consumer comment text is classified by softmax. Experiments show that the proposed method achieves an accuracy of 93.01% on the Laptop data set and 92.45% on the Restaurant data set. The performance of the proposed method in the emotional analysis of consumer comment text is therefore significantly better than that of the comparison methods.
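To make the described pipeline concrete, the following is a minimal sketch of a BERT-plus-hierarchical-attention classifier in PyTorch: contextual word vectors from BERT, a word-level BiLSTM with local attention to form sentence representations, a sentence-level BiLSTM with global attention to form the text representation, and a softmax classifier. The module names, dimensions, attention formulation, and the `bert-base-uncased` checkpoint are illustrative assumptions, not the authors' exact implementation.

```python
# Sketch only: an assumed instantiation of the BERT + BiLSTM + hierarchical
# (local/global) attention architecture described in the abstract.
import torch
import torch.nn as nn
from transformers import BertModel


class Attention(nn.Module):
    """Additive attention pooling over a sequence of hidden states."""
    def __init__(self, hidden_dim):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.context = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, h):  # h: (batch, seq_len, hidden_dim)
        scores = self.context(torch.tanh(self.proj(h)))  # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)
        return (weights * h).sum(dim=1)  # (batch, hidden_dim)


class HierAttnSentimentModel(nn.Module):
    def __init__(self, num_classes=3, lstm_dim=128):
        super().__init__()
        # Assumed pre-trained checkpoint; any BERT variant could be substituted.
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        self.word_lstm = nn.LSTM(self.bert.config.hidden_size, lstm_dim,
                                 batch_first=True, bidirectional=True)
        self.sent_lstm = nn.LSTM(2 * lstm_dim, lstm_dim,
                                 batch_first=True, bidirectional=True)
        self.local_attn = Attention(2 * lstm_dim)   # sentence-level (local) attention
        self.global_attn = Attention(2 * lstm_dim)  # text-level (global) attention
        self.classifier = nn.Linear(2 * lstm_dim, num_classes)

    def forward(self, input_ids, attention_mask):
        # input_ids, attention_mask: (batch, n_sentences, seq_len)
        b, n, s = input_ids.shape
        flat_ids = input_ids.view(b * n, s)
        flat_mask = attention_mask.view(b * n, s)
        # 1) Dynamic, context-dependent word vectors from BERT.
        words = self.bert(flat_ids, attention_mask=flat_mask).last_hidden_state
        # 2) Word-level BiLSTM + local attention -> one vector per sentence.
        word_seq, _ = self.word_lstm(words)
        sent_vecs = self.local_attn(word_seq).view(b, n, -1)
        # 3) Sentence-level BiLSTM + global attention -> one vector per review.
        sent_seq, _ = self.sent_lstm(sent_vecs)
        doc_vec = self.global_attn(sent_seq)
        # 4) Softmax classification of the review's sentiment.
        return torch.log_softmax(self.classifier(doc_vec), dim=-1)
```

In this sketch a consumer review is pre-split into sentences and tokenized to a fixed shape before being passed to the model; padding, masking inside the attention layers, and training details are omitted for brevity.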