Abstract

Sentiment analysis techniques are becoming increasingly important as the number of reviews on the World Wide Web continues to grow. Aspect-based sentiment analysis (ABSA) entails the automatic analysis of sentiment at the fine-grained aspect level. One of the challenges of ABSA is to identify the correct sentiment expressed towards each aspect in a sentence. In this paper, a neural attention model is discussed and three extensions to it are proposed. First, the strengths and weaknesses of the highly successful CABASC model are discussed, and three shortcomings are identified: the aspect representation is poor, the attention mechanism can be extended to better handle polysemy in natural language, and the aspect-specific sentence representation rests on a weak construction. We propose the Extended CABASC (E-CABASC) model, which aims to address all three of these problems. The model incorporates a context-aware aspect representation, a multi-dimensional attention mechanism, and an aspect-specific sentence representation. The main contribution of this work is to show that attention models can be improved with relatively simple extensions, such as fusion gates and multi-dimensional attention, which can be incorporated into many state-of-the-art models. Additionally, an analysis of the parameters and attention weights is provided.
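The abstract does not specify the exact formulation used in E-CABASC, so the following is only an illustrative sketch of the two named building blocks in general form: a multi-dimensional attention mechanism (a per-feature-dimension attention weight instead of a single scalar per word) and a fusion gate (a sigmoid gate interpolating two representations element-wise). All function names, parameter shapes, and the toy data are assumptions made for illustration, not the paper's actual implementation.

```python
import numpy as np

def softmax(x, axis=0):
    # Numerically stable softmax along the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_dim_attention(H, a, W, U, b):
    """Illustrative multi-dimensional attention (not the paper's exact equations).
    H: (n, d) hidden states of the n context words
    a: (d,)   aspect representation
    Returns a (d,) aspect-specific sentence representation where each feature
    dimension has its own attention distribution over the n words."""
    scores = np.tanh(H @ W + a @ U + b)   # (n, d) score per word AND per dimension
    alpha = softmax(scores, axis=0)       # normalise over words, separately per dimension
    return (alpha * H).sum(axis=0)        # weighted sum, one weight per (word, dimension)

def fusion_gate(x, y, Wg, Ug, bg):
    """Sigmoid gate that blends two d-dimensional representations element-wise."""
    g = 1.0 / (1.0 + np.exp(-(x @ Wg + y @ Ug + bg)))
    return g * x + (1.0 - g) * y

# Toy usage with random parameters (shapes chosen arbitrarily for the sketch).
rng = np.random.default_rng(0)
n, d = 5, 8
H, a = rng.normal(size=(n, d)), rng.normal(size=d)
W, U, b = rng.normal(size=(d, d)), rng.normal(size=(d, d)), np.zeros(d)
sentence_repr = multi_dim_attention(H, a, W, U, b)
fused = fusion_gate(sentence_repr, a, W, U, b)
```

Compared with standard attention, which assigns one scalar weight per word, the per-dimension weights allow different features of the same word to contribute differently, which is one way such a mechanism can help with polysemy.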
