Abstract
Aspect-based sentiment analysis (ABSA) aims to identify the sentiment polarity expressed towards a given aspect in a sentence. Attention mechanisms have played an important role in previous state-of-the-art neural models. However, existing attention mechanisms proposed for aspect-based sentiment classification mostly focus on identifying the sentiment words, without considering the relevance of those words to the given aspects in the sentence. To solve this problem, we propose a new architecture, self-attention with co-attention (SACA), for aspect-based sentiment analysis. Self-attention builds direct connections between any two words in the context from a global perspective, while co-attention captures the word-level interaction between the aspect and the context. Moreover, previous works simply averaged the aspect vector to learn attention weights over the context words, which may cause information loss when the aspect contains multiple words. To address this issue, we employ pre-trained contextual word embeddings and character-level word embeddings as the word representation. We evaluate the proposed approach on three datasets; experimental results demonstrate that our model outperforms the state of the art on all three datasets.
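To make the two attention components concrete, the sketch below gives a minimal, simplified view of scaled dot-product self-attention over the context words and a word-level co-attention between aspect and context. The function names, dimensions, and plain dot-product scoring are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(context, d_k):
    # context: (n_ctx, d) word representations of the sentence.
    # Scaled dot-product attention connects every pair of context words.
    scores = context @ context.T / np.sqrt(d_k)    # (n_ctx, n_ctx)
    weights = softmax(scores, axis=-1)
    return weights @ context                       # contextualized word vectors

def co_attention(context, aspect, d_k):
    # Word-level interaction matrix between context and aspect words.
    scores = context @ aspect.T / np.sqrt(d_k)     # (n_ctx, n_asp)
    ctx_to_asp = softmax(scores, axis=-1) @ aspect   # aspect-aware context
    asp_to_ctx = softmax(scores.T, axis=-1) @ context  # context-aware aspect
    return ctx_to_asp, asp_to_ctx

# Toy usage with random embeddings (hypothetical sizes).
d = 8
ctx = np.random.randn(5, d)   # 5 context words
asp = np.random.randn(2, d)   # 2-word aspect term
h = self_attention(ctx, d)
c2a, a2c = co_attention(h, asp, d)
```

Note that the co-attention step keeps a separate score for each aspect word rather than a single averaged aspect vector, which is the behaviour the abstract contrasts with prior work.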