Abstract

Aspect-level sentiment classification, a fine-grained sentiment analysis task that yields complete and detailed results, has been a research focus in recent years. However, the performance of neural network models on this task is largely limited by the small scale of aspect-level sentiment classification datasets, which are costly to label. In this paper, we propose an aspect-level sentiment classification model based on an Attention-based Bidirectional Long Short-Term Memory (Attention-BiLSTM) network and transfer learning. Building on the Attention-BiLSTM model, we propose three variants, Pre-training (PRET), Multitask learning (MTL), and combined Pre-training and Multitask learning (PRET+MTL), which transfer knowledge obtained from document-level sentiment classification training to the aspect-level task. Finally, the performance of the four models is verified on four datasets. Experiments show that the proposed methods compensate for the poor training of neural network models caused by the small size of aspect-level sentiment classification datasets.
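As a rough illustration of the parameter-sharing idea behind PRET and MTL, the sketch below (NumPy, with hypothetical shapes and names such as `attention_pool`, `W_doc`, and `W_asp`; not the paper's actual implementation) shows attention pooling over BiLSTM hidden states feeding two task-specific classifier heads, one for the document-level auxiliary task and one for the aspect-level target task, that share the same encoder output.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_pool(H, w):
    """Attention over BiLSTM hidden states H (T x d): score each time
    step against attention vector w, then take the weighted sum."""
    alpha = softmax(H @ w)      # (T,) attention weights over time steps
    return alpha @ H, alpha     # (d,) pooled context vector

# Hypothetical shared-encoder output for one sentence: T=5 steps, d=8
H = rng.normal(size=(5, 8))
w = rng.normal(size=8)
context, alpha = attention_pool(H, w)

# Two task-specific heads share the encoder/attention parameters:
# document-level sentiment (pre-training / auxiliary task) and
# aspect-level sentiment (target task).
W_doc = rng.normal(size=(8, 2))   # document-level: pos / neg
W_asp = rng.normal(size=(8, 3))   # aspect-level: pos / neu / neg
doc_logits = context @ W_doc
asp_logits = context @ W_asp
```

In PRET, the encoder and attention parameters would first be trained with the document-level head alone and then fine-tuned on the aspect-level task; in MTL, both heads are trained jointly with a shared loss.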
