Abstract

Aspect Level Sentiment Classification (ALSC) deals with classifying the sentiment polarity expressed towards a particular aspect or target. The performance of any ALSC method is driven primarily by its ability to map the aspect term to the correct sentiment-bearing context words. The syntactic information in the dependency tree is crucial for this mapping, so recent top-performing methods use graph neural networks (GNNs) to incorporate the syntactic knowledge of the dependency tree. However, such methods are architecturally complex and rely on computationally expensive GNNs. In this work, we propose the syntactic neighbour-based attention network (SNBAN), a method that is architecturally simple, efficient, and faster to train. SNBAN has a novel yet simple architecture built on a Bi-GRU (Bi-directional Gated Recurrent Unit). Specifically, SNBAN exploits both the semantic and the syntactic knowledge of the input sentence using two multi-head attentions (MHATT), where the first MHATT handles the semantic knowledge and the second the syntactic knowledge. Experiments show that the proposed SNBAN significantly outperforms baselines that use no syntactic knowledge. They also show that SNBAN requires significantly less average training time than GNN-based methods while achieving comparable accuracy and F1 score.
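The abstract contrasts a semantic attention over all context words with a syntactic attention restricted to an aspect term's dependency-tree neighbours. The paper does not give implementation details, so the following is only a minimal single-head sketch of that contrast: the same scaled dot-product attention is run once unmasked (semantic) and once with a hypothetical boolean neighbour mask derived from a dependency tree (syntactic), masked positions receiving zero weight.

```python
import math

def softmax(scores):
    # Numerically stable softmax; exp(-inf) underflows cleanly to 0.0.
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attention(query, keys, values, mask=None):
    """Scaled dot-product attention for a single query vector.

    mask (optional): mask[i] = False marks position i as not a
    syntactic neighbour of the aspect term, so it is excluded from
    the attention distribution (score set to -inf before softmax).
    Returns (context vector, attention weights).
    """
    d = len(query)
    scores = [sum(q * k for q, k in zip(query, key)) / math.sqrt(d)
              for key in keys]
    if mask is not None:
        scores = [s if keep else float("-inf")
                  for s, keep in zip(scores, mask)]
    weights = softmax(scores)
    context = [sum(w * v[j] for w, v in zip(weights, values))
               for j in range(len(values[0]))]
    return context, weights

# Toy hidden states (e.g. Bi-GRU outputs) for three context words.
hidden = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
aspect_query = [1.0, 0.0]

# Semantic attention: all context words participate.
_, semantic_w = attention(aspect_query, hidden, hidden)

# Syntactic attention: only dependency-tree neighbours (hypothetical
# mask for illustration) participate; word 1 is masked out.
neighbour_mask = [True, False, True]
_, syntactic_w = attention(aspect_query, hidden, hidden, mask=neighbour_mask)
```

In a multi-head setting each head would apply learned projections before this step; the sketch omits them to isolate the masking idea.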
