Abstract

Short text is an important form of information dissemination and opinion expression on various social media platforms. Sentiment analysis of short texts helps in understanding customers' emotional states and in obtaining their opinions of and attitudes toward events, information, and products; it is difficult, however, because of the sparsity of short-text data. Unlike traditional methods that use external knowledge, this paper proposes a bi-level attention model for sentiment analysis of short texts that handles data sparsity without relying on external knowledge. Specifically, at the word level, the model improves word representation by introducing latent topic information into word-level semantic representation: a neural topic model discovers the latent topics of the text, and a new topic-word attention mechanism explores the semantics of words from the perspective of topic-word association. At the sequence level, a secondary attention mechanism captures the relationship between local and global sentiment expression. Experiments on the ChnSentiCorp-Htl-ba-10000 and NLPCC-ECGC datasets validate the effectiveness of the BAM model.
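The workflow described above can be pictured with a small sketch. The following is a minimal, hypothetical PyTorch skeleton, assuming dot-product topic-word scoring, a BiGRU sequence encoder, additive sequence-level attention, and toy layer sizes; the paper's exact layers, dimensions, and scoring functions may differ, and the class name and parameters below are illustrative rather than the authors' implementation.

```python
# Minimal sketch of a bi-level attention classifier (PyTorch).
# Assumptions for illustration only: dot-product topic-word scoring,
# a BiGRU sequence encoder, additive sequence-level attention, and toy
# layer sizes. The paper's exact architecture may differ.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BiLevelAttentionSketch(nn.Module):
    def __init__(self, vocab_size, embed_dim=128, num_topics=50,
                 hidden_dim=128, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Topic vectors; in the paper latent topics come from a neural topic model.
        self.topic_vectors = nn.Parameter(torch.randn(num_topics, embed_dim))
        self.encoder = nn.GRU(embed_dim, hidden_dim,
                              batch_first=True, bidirectional=True)
        self.seq_attn = nn.Linear(2 * hidden_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids, topic_mixture):
        # token_ids: (B, L) word indices; topic_mixture: (B, K) topic proportions.
        U = self.embed(token_ids)                                  # (B, L, D)
        # Word level: topic-word attention between word embeddings and topics.
        P = torch.einsum('bld,kd->blk', U, self.topic_vectors)     # (B, L, K)
        alpha = F.softmax(P * topic_mixture.unsqueeze(1), dim=-1)  # topic-aware weights
        topic_ctx = torch.einsum('blk,kd->bld', alpha, self.topic_vectors)
        enriched = U + topic_ctx              # topic-enriched word representations
        # Sequence level: secondary attention over the encoded sequence.
        H, _ = self.encoder(enriched)                              # (B, L, 2*hidden)
        beta = F.softmax(self.seq_attn(H), dim=1)                  # (B, L, 1)
        doc = (beta * H).sum(dim=1)                                # (B, 2*hidden)
        return self.classifier(doc)                                # (B, num_classes)
```

The two attention steps mirror the two levels in the abstract: the word-level weights alpha combine word embeddings with topic vectors, and the sequence-level weights beta pool the encoded sequence into a single representation for classification.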

Highlights

  • Information dissemination, opinion expression, and other behaviors are increasingly presented in the form of short texts in various social media platforms, emerging news media, e-commerce, and other fields [1]

  • Experiments on the ChnSentiCorp-Htl-ba-10000 and NLPCC-ECGC datasets validate the effectiveness of the bi-level attention model (BAM)

  • P_{l,k} measures how well word embedding u_l matches topic vector t_k and thereby reflects, to a certain degree, the correlation between the word and the topics; we argue that more topic information can be added by involving the topic component θ_k. α_{l,k} denotes the relationship between the word at position l of the sequence and the k-th latent topic; u_l is the l-th row vector of matrix U, and t_k is the k-th row vector of the topic vector matrix T (see the sketch below)
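A toy illustration of these quantities follows, with hypothetical shapes and a simple dot-product score; how θ_k is actually combined with P_{l,k} in the paper may differ.

```python
# Toy illustration of the quantities in the highlight above. P[l, k] is the
# dot-product match between word embedding u_l (row l of U) and topic vector
# t_k (row k of T); alpha[l, k] folds in the topic proportion theta_k before
# normalising over topics. Shapes and the scoring function are assumptions.
import torch
import torch.nn.functional as F

L, K, D = 6, 4, 8                      # words, topics, embedding size (toy values)
U = torch.randn(L, D)                  # word embedding matrix, u_l = U[l]
T = torch.randn(K, D)                  # topic vector matrix, t_k = T[k]
theta = F.softmax(torch.randn(K), 0)   # topic mixture of the text, theta_k

P = U @ T.t()                          # P[l, k]: how well u_l matches t_k
alpha = F.softmax(P * theta, dim=-1)   # alpha[l, k]: word-topic attention weight
print(alpha.shape)                     # torch.Size([6, 4]); each row sums to 1
```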


Summary

INTRODUCTION

Information dissemination, opinion expression, and other behaviors are increasingly presented in the form of short texts in various social media platforms, emerging news media, e-commerce, and other fields [1]. Data sparsity causes difficulties in sentiment analysis [5]–[7]. Existing methods are limited in their scope of application because they require numerous manual features or depend on high-quality external knowledge bases in specific fields [5], [9], [10]. To address this, an end-to-end short-text sentiment analysis method based on a bi-level attention model (BAM), built on topics and sequences, is presented; this method does not introduce external knowledge to assist the comprehension of word meaning.

RELATED WORKS
TOPIC-WORD ATTENTION MECHANISM
SEQUENCE-LEVEL ATTENTION MECHANISM
CASE STUDY
CONCLUSION
