Abstract

Aspect-level sentiment classification remains challenging: how can the contextual semantic correlation between the aspect word and the content words be captured more effectively? LSTM-SNP is a variant of long short-term memory (LSTM) inspired by the nonlinear spiking mechanisms of nonlinear spiking neural P systems. To address this challenge, we modify LSTM-SNP and, based on this modification, design a bidirectional LSTM-SNP, termed BiLSTM-SNP. Building on BiLSTM-SNP and an attention mechanism, we propose a novel method for aspect-level sentiment classification: BiLSTM-SNP captures the semantic correlation between the aspect word and the content words, while the attention mechanism generates appropriate attention weights for the hidden states of BiLSTM-SNP. Experiments on English and Chinese data sets, comparing the proposed model with several baseline models, demonstrate the effectiveness of the proposed model for aspect-level sentiment classification.

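As a rough illustration of the pipeline the abstract describes, the sketch below wires a bidirectional recurrent encoder to an aspect-conditioned attention layer that weights the hidden states before classification. Since the abstract does not give the LSTM-SNP update equations, a standard PyTorch LSTM stands in for the BiLSTM-SNP encoder, and the class name `AspectAttentionClassifier`, all module names, and all dimensions are illustrative assumptions rather than the authors' implementation.

```python
# Minimal sketch of aspect-aware attention over bidirectional hidden states.
# NOTE: nn.LSTM is a stand-in for the LSTM-SNP cell described in the paper;
# names and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class AspectAttentionClassifier(nn.Module):
    def __init__(self, vocab_size, emb_dim=300, hidden_dim=128, num_classes=3):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # Stand-in bidirectional encoder (the paper replaces this cell with LSTM-SNP).
        self.encoder = nn.LSTM(emb_dim, hidden_dim, bidirectional=True, batch_first=True)
        # Scores each hidden state against the pooled aspect representation.
        self.attn = nn.Linear(2 * hidden_dim + emb_dim, 1)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, context_ids, aspect_ids):
        # context_ids: (batch, seq_len); aspect_ids: (batch, aspect_len)
        context = self.embedding(context_ids)              # (B, T, E)
        aspect = self.embedding(aspect_ids).mean(dim=1)    # (B, E) pooled aspect words
        hidden, _ = self.encoder(context)                  # (B, T, 2H)

        # Attention weights over hidden states, conditioned on the aspect word.
        aspect_rep = aspect.unsqueeze(1).expand(-1, hidden.size(1), -1)
        scores = self.attn(torch.cat([hidden, aspect_rep], dim=-1)).squeeze(-1)
        weights = F.softmax(scores, dim=-1)                # (B, T)

        # Weighted sum of hidden states -> sentence representation -> sentiment logits.
        sentence = torch.bmm(weights.unsqueeze(1), hidden).squeeze(1)  # (B, 2H)
        return self.classifier(sentence)


# Usage with toy tensors.
model = AspectAttentionClassifier(vocab_size=10000)
context = torch.randint(0, 10000, (4, 20))  # batch of 4 sentences, 20 tokens each
aspect = torch.randint(0, 10000, (4, 2))    # aspect phrases of 2 tokens
logits = model(context, aspect)             # (4, 3) sentiment class scores
```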