Abstract

Aspect-based sentiment classification aims to infer the sentiment expressed towards a specific aspect in a sentence. The key to this task is exploiting the relationships between sentiment words and aspect words. Mainstream methods use Recurrent Neural Networks (RNNs), attention mechanisms, or Graph Neural Networks (GNNs) to explore syntactic information. Though these methods are undoubtedly effective, they still face several challenges: (1) most studies use only syntactic dependency graphs and therefore lack a richer representation of inter-word relationships; (2) some studies explore multiple relation graphs but fail to effectively integrate syntactic dependencies with semantic or other information, impeding the exchange of the different kinds of information; moreover, including additional information graphs increases the model's computational burden. In this paper, we construct a word-level relational hypergraph that encodes various syntactic and semantic relationships between aspect words and other context words. We propose an aspect-specific hypergraph attention network (ASHGAT) to thoroughly exploit the hypergraph's information. Furthermore, we design an aspect-oriented, syntactic-distance-based weight distribution mechanism to optimize the hypergraph attention. We conducted extensive experiments on four benchmark datasets from SemEval 2014, 2015, and 2016. The results show that ASHGAT outperforms the other SOTA baselines.
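The abstract does not specify how the aspect-oriented syntactic-distance weights are computed, so the following is only a minimal illustrative sketch of one plausible reading: treat the sentence's dependency tree as an undirected graph, measure each word's hop distance from the aspect word by BFS, and turn those distances into weights that favor syntactically closer words (here via a softmax over negative distance). The example sentence, edge list, and the exponential decay are all assumptions for illustration, not the paper's actual mechanism.

```python
from collections import deque
import math

def syntactic_distances(edges, aspect, n_words):
    """BFS over an undirected dependency tree: hop distance from the
    aspect word to every other word (illustrative, not the paper's code)."""
    adj = {i: [] for i in range(n_words)}
    for u, v in edges:
        adj[u].append(v)
        adj[v].append(u)
    dist = [math.inf] * n_words
    dist[aspect] = 0
    q = deque([aspect])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if dist[v] == math.inf:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def distance_weights(dist):
    """Softmax over negative syntactic distance: words closer to the
    aspect receive larger attention weights (assumed decay function)."""
    scores = [math.exp(-d) for d in dist]
    z = sum(scores)
    return [s / z for s in scores]

# Toy example: "The food was absolutely delicious", aspect = "food" (index 1);
# (head, dependent) edges are hypothetical, chosen only for demonstration.
edges = [(1, 0), (2, 1), (4, 2), (4, 3)]
dist = syntactic_distances(edges, aspect=1, n_words=5)
weights = distance_weights(dist)
```

Under this sketch the aspect word itself gets the largest weight and weights fall off with syntactic (not linear) distance, which is the intuition behind weighting attention by position in the dependency tree rather than in the raw token sequence.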
