Abstract

Most popular models for aspect-level sentiment classification focus on designing complicated neural networks that scale the importance of each word in a sentence; this paper instead addresses the problem from the perspective of the semantic space. Motivated by the fact that the senses of a word can be embedded into the semantic space through a distributed representation, this paper hypothesizes that each sense of a word corresponds to one or more specific dimensions, so that aspect-level sentiment classification can be simplified into searching for the dimensions related to the aspects and sentiments of interest. Specifically, an Attention Vector (ATV) based on the attention mechanism is designed for each aspect of a given task; it consists of two sub-vectors, a Dimension Attention Vector (DATV) and a Sentiment Attention Vector (SATV). The DATV determines the significance of each dimension according to its correlation with an aspect, while the SATV allocates weights to word attributes determined by sentiment polarity and part-of-speech (PoS) tagging. Given a sub-dataset related to an aspect, the ATV is optimized by an Artificial Bee Colony (ABC) algorithm with a Support Vector Machine (SVM) classifier, whose objective is to maximize classification accuracy. Intrinsically, the DATV reduces the ambiguity arising from polysemy, while the SATV serves as an auxiliary means for optimizing the DATV and helps eliminate misunderstandings caused by antonyms. The optimized DATV is then applied in a Convolutional Neural Network (CNN) model by simply scaling the pretrained word embeddings used as inputs (named the ATV-CNN model). Experimental results show that the ATV-CNN model offers substantial advantages over state-of-the-art models.
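To make the mechanics concrete, the sketch below illustrates the two operations the abstract describes: element-wise scaling of pretrained word embeddings by a DATV, and an accuracy-based fitness score (SVM classification accuracy) of the kind an ABC-style search could maximize. This is a minimal illustration, not the authors' implementation; the data, dimensions, and function names (`scale_by_datv`, `fitness`) are hypothetical.

```python
# Minimal sketch (not the authors' code): re-weight pretrained embedding
# dimensions with a Dimension Attention Vector (DATV) and score a candidate
# DATV by SVM classification accuracy, as an ABC-style objective might.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Toy stand-ins: 200 sentences, 20 tokens each, 50-dim pretrained embeddings.
n_sent, n_tok, dim = 200, 20, 50
embeddings = rng.normal(size=(n_sent, n_tok, dim))  # pretrained word vectors
labels = rng.integers(0, 2, size=n_sent)            # sentiment polarity per sentence

def scale_by_datv(emb, datv):
    """Element-wise scaling of every word vector by the DATV weights."""
    return emb * datv  # broadcasts the per-dimension weights over all tokens

def fitness(datv):
    """Objective an ABC search could maximize: SVM accuracy on scaled features."""
    scaled = scale_by_datv(embeddings, datv)
    features = scaled.mean(axis=1)  # simple averaged sentence representation
    return cross_val_score(SVC(kernel="linear"), features, labels, cv=3).mean()

# One candidate solution an ABC algorithm might evaluate during its search.
candidate_datv = rng.uniform(0.0, 1.0, size=dim)
print("fitness of candidate DATV:", fitness(candidate_datv))
```

In the ATV-CNN model described above, the optimized DATV would then be applied the same way, by scaling the pretrained embeddings before they are fed to the CNN.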
