Abstract

Aspect-based sentiment analysis (ABSA) aims to automatically identify the sentiment polarity of specific aspect words in a given sentence or document. Existing studies have recognised the value of interactive learning in ABSA and have developed various methods to precisely model aspect words and their contexts through interactive learning. However, most of these methods model aspect words and their contexts in a shallow interactive manner, which may fail to capture complex sentiment information. To address this issue, we propose a Lightweight Multilayer Interactive Attention Network (LMIAN) for ABSA. Specifically, we first employ a pre-trained language model to initialise the word embedding vectors. Second, an interactive computational layer is designed to build correlations between aspect words and their contexts; the degree of correlation is computed by multiple computational layers equipped with neural attention models. Third, a parameter-sharing strategy is applied across the computational layers, allowing the model to learn complex sentiment features at a lower memory cost. Finally, we evaluate LMIAN on six publicly available sentiment analysis datasets. Extensive experiments show that LMIAN outperforms other advanced methods while consuming relatively little memory.
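As a rough illustration of the idea described above (not the authors' released implementation), the following PyTorch sketch shows one way that interactive attention between aspect and context representations could be stacked over several rounds while sharing a single set of attention weights; the class name, number of layers, head count, and pooling choice are all assumptions, and the input embeddings are presumed to come from a pre-trained language model.

```python
import torch
import torch.nn as nn


class SharedInteractiveAttention(nn.Module):
    """Hypothetical sketch: multilayer interactive attention with parameter sharing.

    The same attention module is reused in every round, so aspect and context
    representations can refine each other repeatedly without adding parameters
    per layer, keeping memory consumption low.
    """

    def __init__(self, hidden_dim: int, num_layers: int = 3, num_heads: int = 4):
        super().__init__()
        self.num_layers = num_layers
        # One attention module shared across all interactive layers.
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(hidden_dim)

    def forward(self, context: torch.Tensor, aspect: torch.Tensor) -> torch.Tensor:
        # context: (batch, ctx_len, hidden_dim), e.g. BERT embeddings of the sentence
        # aspect:  (batch, asp_len, hidden_dim), embeddings of the aspect words
        for _ in range(self.num_layers):
            # Aspect attends to context, and context attends to aspect,
            # using the same (shared) attention weights in both directions.
            asp_new, _ = self.attn(aspect, context, context)
            ctx_new, _ = self.attn(context, aspect, aspect)
            aspect = self.norm(aspect + asp_new)
            context = self.norm(context + ctx_new)
        # Pool the aspect-aware context into a single sentence representation
        # that a downstream classifier could map to a sentiment polarity.
        return context.mean(dim=1)
```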
