Abstract

Aspect-level sentiment analysis (ALSA) aims to determine the sentiment polarity of different aspect terms in a sentence. Previous works leveraging traditional dependency syntax parsing trees (DSPT) to encode contextual syntactic information have obtained state-of-the-art results. However, these works may not learn fine-grained syntactic knowledge efficiently, which makes it difficult for them to take advantage of the local context. Furthermore, these works fail to sufficiently exploit the dependency relations in the DSPT. To address these problems, we propose LCSA, a novel method that enhances local knowledge through a Local Context network based on Proximity Values (LCPV) and Syntax-clusters Attention (SCA). LCPV first obtains induced trees from pre-trained models and generates syntactic proximity values between each context word and the aspect term to adaptively determine the extent of the local context. Our improved SCA further extracts fine-grained knowledge: it not only focuses on the clusters that are essential for the target aspect term but also guides the model to learn the essential words inside each cluster of the DSPT. Extensive experimental results on multiple benchmark datasets demonstrate that LCSA is highly robust and achieves state-of-the-art performance for ALSA.
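The abstract describes LCPV as deriving syntactic proximity values between each context word and the aspect term from an induced dependency tree and using them to delimit the local context. The abstract does not specify the exact computation, so the sketch below is only one plausible formulation: proximity as inverse hop distance in the tree, with a hypothetical threshold marking local-context words. The function names, the 1/(1+d) scoring, and the threshold are illustrative assumptions, not the authors' method.

```python
from collections import defaultdict, deque


def syntactic_proximity(num_tokens, edges, aspect_idx):
    """Assumed formulation: proximity of every token to the aspect token.

    `edges` is a list of (head, dependent) index pairs from a dependency
    parse or an induced tree; proximity is 1 / (1 + hop distance).
    """
    adj = defaultdict(list)
    for head, dep in edges:
        adj[head].append(dep)
        adj[dep].append(head)

    # Breadth-first search from the aspect token gives tree hop distances.
    dist = {aspect_idx: 0}
    queue = deque([aspect_idx])
    while queue:
        node = queue.popleft()
        for nxt in adj[node]:
            if nxt not in dist:
                dist[nxt] = dist[node] + 1
                queue.append(nxt)

    # Tokens disconnected from the tree receive proximity 0.
    return [1.0 / (1 + dist[i]) if i in dist else 0.0 for i in range(num_tokens)]


def local_context_mask(proximity, threshold=0.5):
    """Hypothetical rule: tokens above the threshold count as local context."""
    return [p >= threshold for p in proximity]


if __name__ == "__main__":
    # "The battery life is great" with aspect "battery" (index 1);
    # edges follow a typical dependency parse of this sentence.
    edges = [(2, 0), (2, 1), (4, 2), (4, 3)]
    prox = syntactic_proximity(5, edges, aspect_idx=1)
    print(prox)                      # [0.33..., 1.0, 0.5, 0.25, 0.33...]
    print(local_context_mask(prox))  # tokens treated as local context
```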
