Abstract

Aspect-level sentiment classification, i.e., predicting the sentiment polarity toward a specific aspect term in an opinionated sentence, is an interesting but challenging research problem. Attention-based recurrent neural networks have been proposed for this task, since the attention mechanism can identify the context words that contribute most to the prediction, and they have shown great promise. However, a major drawback of these attention-based approaches is that they ignore the explicit position context. Drawing inspiration from how position context is modeled in information retrieval and question answering, we hypothesize that context words near the aspect deserve more attention than those far away, especially when a review sentence is long or contains multiple aspect terms. Based on this conjecture, we propose a new attentive LSTM model, dubbed PosATT-LSTM, which not only accounts for the importance of each context word but also incorporates position-aware vectors that represent the explicit position relationship between the aspect and its context words. We conduct extensive experiments on the SemEval 2014 datasets, and the encouraging results demonstrate the efficacy of our proposed approach.
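The abstract does not spell out how the position-aware vectors enter the attention computation. As a rough illustration only, the following is a minimal PyTorch sketch of one plausible realization, in which each relative distance to the aspect term gets a learned embedding that is concatenated with the LSTM hidden state before scoring attention. All names and hyperparameters here (`PosAttLSTM`, `pos_dim`, `max_len`, and so on) are assumptions for the sketch, not taken from the paper.

```python
# Hypothetical sketch of position-aware attention over LSTM hidden states;
# not the paper's exact formulation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class PosAttLSTM(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hidden_dim=128, pos_dim=32,
                 max_len=80, num_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        # One learned vector per relative distance to the aspect term
        # (the "position-aware vectors" of the abstract, formulation assumed).
        self.pos_embed = nn.Embedding(2 * max_len + 1, pos_dim)
        self.max_len = max_len
        self.lstm = nn.LSTM(emb_dim, hidden_dim, batch_first=True)
        self.att_w = nn.Linear(hidden_dim + pos_dim, 1, bias=False)
        self.fc = nn.Linear(hidden_dim, num_classes)

    def forward(self, tokens, aspect_pos):
        # tokens: (batch, seq_len) word ids; aspect_pos: (batch,) aspect index.
        h, _ = self.lstm(self.embed(tokens))                  # (B, T, H)
        T = tokens.size(1)
        idx = torch.arange(T, device=tokens.device).unsqueeze(0)
        rel = (idx - aspect_pos.unsqueeze(1)).clamp(-self.max_len, self.max_len)
        p = self.pos_embed(rel + self.max_len)                # (B, T, P)
        # Attention scores see both the hidden state and the position vector,
        # so context words close to the aspect can receive higher weight.
        scores = self.att_w(torch.tanh(torch.cat([h, p], dim=-1))).squeeze(-1)
        alpha = F.softmax(scores, dim=-1)                     # (B, T)
        s = torch.bmm(alpha.unsqueeze(1), h).squeeze(1)       # (B, H)
        return self.fc(s)                                     # (B, num_classes)

# Usage: logits = PosAttLSTM(vocab_size=10000)(tokens, aspect_pos)
```

Concatenating a distance embedding into the attention score is only one common way to inject position context; weighting word embeddings by a distance-decay function is another, and the abstract alone does not indicate which variant PosATT-LSTM uses.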
