Abstract

Aspect-level sentiment analysis aims to predict the sentiment orientation of a specific aspect in a given context. Recent studies have achieved great success in modeling aspects and contexts with long short-term memory (LSTM) networks and attention mechanisms. However, these approaches overlook the semantic correlations between individual words of the aspect and the context, which weakens the resulting feature representations. To address this problem and analyze semantic correlations comprehensively at both the word level and the feature level, a co-attention mechanism is proposed that captures the interactions between aspect and context, interactively modeling their mutual semantic influences to generate more informative representations. Specifically, the co-attention mechanism consists of a 1-pair hop mechanism and an interactive mechanism: the 1-pair hop mechanism attends to the important words of the aspect or context, while the interactive mechanism highlights the significant features of the aspect or context by computing an interactive attention matrix at the feature level. In addition, since one context may contain more than one aspect, a novel loss function is designed to fully exploit the attention weights of different aspects over every word of the given context. Extensive comparative experiments based on the GloVe and BERT pre-trained models show that the proposed method achieves state-of-the-art performance on the Restaurant and Twitter datasets. Furthermore, ablation studies validate the necessity and importance of the 1-pair hop mechanism and the interactive mechanism.
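The interactive attention idea described above can be illustrated with a minimal sketch. This is not the paper's exact formulation; it assumes a simple bilinear scoring between aspect and context hidden states, and the function and parameter names (`interactive_attention`, `W`) are illustrative only.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def interactive_attention(context, aspect, W):
    """Sketch of a co-attention step between aspect and context.

    context: (n, d) hidden states of the n context words
    aspect:  (m, d) hidden states of the m aspect words
    W:       (d, d) interaction matrix (hypothetical learnable parameter)
    """
    # Pairwise word-level interaction scores: entry (i, j) relates
    # context word i to aspect word j, shape (n, m).
    scores = context @ W @ aspect.T
    # Each context word attends over the aspect words, and vice versa.
    ctx_to_asp = softmax(scores, axis=1)    # (n, m), rows sum to 1
    asp_to_ctx = softmax(scores.T, axis=1)  # (m, n), rows sum to 1
    # Attention-weighted representations of each side in terms of the other.
    aspect_aware_context = ctx_to_asp @ aspect   # (n, d)
    context_aware_aspect = asp_to_ctx @ context  # (m, d)
    return aspect_aware_context, context_aware_aspect
```

In a full model, both outputs would typically be pooled and fed to a classifier; here the sketch only shows how a shared interaction matrix lets attention flow in both directions.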
