Abstract

Aspect-level sentiment classification (ASC) aims to identify the sentiment polarity of given aspect terms in a sentence. Previous neural networks have used attention mechanisms to align context words with the appropriate aspect terms. Without considering syntactic dependencies, however, these models may erroneously focus on context words that are unrelated to the aspect terms. To address this issue, graph convolutional networks (GCNs) and graph attention networks (GATs) have been proposed to build a graph over the dependency parse tree, allowing the representations of context words to be propagated to the aspect terms according to their syntactic dependencies. However, these models treat all syntactic dependencies as the same type, which may result in inappropriate propagation of word representations through the graph. To further distinguish between syntactic dependencies, this study proposes a syntactic graph attention network (SGAN) that incorporates knowledge of dependency types into the graph attention network. The dependency types are modeled as edge embeddings, so the attention weight of each edge is learned according to its dependency type. By considering the different dependency types and their weights, the proposed method can block inappropriate propagation and better associate context words with aspect terms. To stabilize the training process and enrich the diversity of graph representations, a weighted multi-head attention mechanism is applied to compose the graph representations generated by the different heads. Experimental results on five benchmark datasets show that the SGAN yields more accurate results than existing methods.
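The core idea, edge embeddings that make attention weights dependency-type-aware, can be illustrated with a minimal sketch. This is not the authors' implementation; the dimensions, the three dependency types, and the single-layer `sgat_layer` function are illustrative assumptions, and standard GAT-style scoring (a shared attention vector over the concatenated projected node features and the edge embedding, followed by LeakyReLU and a softmax over neighbors) is used as the base mechanism:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def leaky_relu(x, alpha=0.2):
    return np.where(x > 0, x, alpha * x)

# Hypothetical dimensions: node input/output features and edge-embedding size.
d_in, d_out, d_edge = 8, 6, 4

W = rng.normal(size=(d_out, d_in))          # shared node projection
E = rng.normal(size=(3, d_edge))            # one embedding per dependency type (e.g. nsubj, amod, obj)
a = rng.normal(size=(2 * d_out + d_edge,))  # attention vector over [z_i || z_j || edge]

def sgat_layer(h, adj, dep_type):
    """One edge-typed graph attention layer (sketch).

    h: (n, d_in) word representations; adj: (n, n) 0/1 dependency-graph
    adjacency; dep_type: (n, n) integer index into E for each edge."""
    n = h.shape[0]
    z = h @ W.T                              # project all nodes: (n, d_out)
    out = np.zeros((n, d_out))
    for i in range(n):
        nbrs = np.nonzero(adj[i])[0]
        # The score of each edge depends on its dependency-type embedding,
        # so different dependency types receive different attention weights.
        scores = np.array([
            leaky_relu(a @ np.concatenate([z[i], z[j], E[dep_type[i, j]]]))
            for j in nbrs
        ])
        alpha = softmax(scores)
        out[i] = alpha @ z[nbrs]             # propagate neighbor representations
    return out

# Toy 3-word dependency graph (self-loops included).
h = rng.normal(size=(3, d_in))
adj = np.array([[1, 1, 0], [1, 1, 1], [0, 1, 1]])
dep = np.array([[0, 1, 0], [1, 0, 2], [0, 2, 0]])
out = sgat_layer(h, adj, dep)
print(out.shape)  # (3, 6)
```

Edges with dependency types whose learned embeddings drive their scores down receive near-zero attention after the softmax, which is how inappropriate propagation can be blocked. The full model would run several such heads and combine their outputs with learned per-head weights.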
