Abstract

Aspect-based sentiment analysis (ABSA) is a fine-grained task that detects the sentiment polarities of particular aspect words in a sentence. With the rise of graph convolutional networks (GCNs), current ABSA models mostly adopt graph-based methods. These methods construct a dependency tree for each sentence and regard each word as a unique node. More specifically, they conduct classification using aspect representations instead of sentence representations, and update them with GCNs. However, such methods rely heavily on the quality of the dependency tree and may lose global sentence information, which is also helpful for classification. To address these issues, we design a new ABSA model, AG-VSR. Two kinds of representations are proposed to perform the final classification: Attention-assisted Graph-based Representation (A2GR) and Variational Sentence Representation (VSR). A2GR is produced by a GCN module that takes as input a dependency tree modified by the attention mechanism. VSR is sampled from a distribution learned by a VAE-like encoder–decoder structure. Extensive experiments show that our model AG-VSR achieves competitive results. Our code and data have been released at https://github.com/wangbing1416/VAGR.
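The two representations described above can be illustrated with a minimal, hypothetical sketch (this is not the authors' implementation; all function names, dimensions, and the particular way attention scores are blended with the dependency tree are illustrative assumptions):

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_adjusted_gcn(H, A_dep):
    """One toy GCN layer over a dependency adjacency softened by attention.

    H: (n, d) word representations; A_dep: (n, n) 0/1 dependency-tree edges.
    Blending soft attention scores with the hard tree (a hypothetical blend)
    reduces reliance on parser quality -- the intuition behind A2GR.
    """
    attn = softmax(H @ H.T, axis=-1)            # (n, n) attention weights
    A = A_dep * attn + 0.5 * attn               # illustrative tree/attention mix
    A = A / A.sum(axis=-1, keepdims=True)       # row-normalize
    W = np.eye(H.shape[1])                      # identity weight for the sketch
    return np.maximum(A @ H @ W, 0.0)           # ReLU(A H W)

def variational_sentence_repr(h_sent, rng):
    """Sample z ~ N(mu, sigma^2) via the reparameterization trick,
    as in a VAE-like encoder producing a sentence-level representation (VSR)."""
    mu, log_var = h_sent, np.zeros_like(h_sent)  # toy encoder outputs
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * log_var) * eps

rng = np.random.default_rng(0)
H = rng.standard_normal((5, 8))                          # 5 words, dim 8
A_dep = np.eye(5) + np.eye(5, k=1) + np.eye(5, k=-1)     # chain-shaped "tree"
a2gr = attention_adjusted_gcn(H, A_dep)                  # graph-based word reps
vsr = variational_sentence_repr(H.mean(axis=0), rng)     # sampled sentence rep
print(a2gr.shape, vsr.shape)
```

In a full model, the aspect rows of `a2gr` and the sampled `vsr` would be combined and fed to a classifier, so that both local graph structure and global sentence information contribute to the prediction.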
