Abstract

Aspect-based sentiment analysis (ABSA) is a fine-grained task that predicts the sentiment polarity of given aspects in a sentence. Recently, graph-based methods such as graph convolutional networks (GCNs), graph attention networks (GATs), and their extensions have been used to model the syntactic structure of sentences and have achieved promising results. However, these graph-based methods generally homogenize dependency relations, resulting in a loss of syntactic information. In addition, models that rely solely on syntax-based graphs fail to identify sentiments in informal reviews. In this study, we propose a dependency feature-aware hierarchical dual-graph convolutional network (HD-GCN), which consists of two specifically designed GCNs for extracting syntactic and semantic information. For semantic information, we use a self-attention matrix to represent the correlations between nodes. For syntactic information, we design a dependency feature-aware GCN that adaptively learns mappings of the various dependency relations in the adjacency matrix. Moreover, we stack the network into multiple layers, each of which learns a different relation matrix to capture linguistic features at different levels. Extensive experiments on the SemEval 2014 and Twitter datasets show that our model outperforms state-of-the-art models on the ABSA task, demonstrating its effectiveness.
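The dual-graph idea in the abstract — a semantic branch whose adjacency comes from self-attention and a syntactic branch whose adjacency assigns a learned weight to each dependency type — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the shapes, the scalar-per-dependency-type weighting, and all variable names are assumptions made for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def semantic_adjacency(H, Wq, Wk):
    """Self-attention matrix: pairwise correlations between word nodes."""
    Q, K = H @ Wq, H @ Wk
    return softmax(Q @ K.T / np.sqrt(K.shape[-1]))

def syntactic_adjacency(dep_ids, dep_weight, mask):
    """Dependency feature-aware adjacency: instead of homogenizing all
    dependencies to 1, each dependency type maps to a learnable weight."""
    # dep_ids: (n, n) int matrix of dependency-type indices between word pairs
    # dep_weight: (num_dep_types,) learnable scalar per dependency type (assumed)
    return dep_weight[dep_ids] * mask  # zero out pairs with no dependency arc

def gcn_layer(A, H, W):
    """One graph-convolution step: normalize, aggregate neighbours, project, ReLU."""
    deg = A.sum(axis=1, keepdims=True) + 1e-9
    return np.maximum((A / deg) @ H @ W, 0.0)

rng = np.random.default_rng(0)
n, d = 5, 8                      # 5 words, hidden size 8 (toy sizes)
H = rng.normal(size=(n, d))      # contextual word representations
Wq, Wk, W = (rng.normal(size=(d, d)) for _ in range(3))

A_sem = semantic_adjacency(H, Wq, Wk)
dep_ids = rng.integers(0, 4, size=(n, n))        # 4 hypothetical dependency types
mask = (rng.random((n, n)) < 0.4).astype(float)  # which word pairs share an arc
A_syn = syntactic_adjacency(dep_ids, rng.random(4), mask)

H_sem = gcn_layer(A_sem, H, W)   # semantic branch
H_syn = gcn_layer(A_syn, H, W)   # syntactic branch
print(H_sem.shape, H_syn.shape)  # both (5, 8)
```

Stacking several such layers, each with its own learned relation matrices, would correspond to the hierarchical design described above; how the two branches are fused is left out here, as the abstract does not specify it.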
