Abstract

The goal of Aspect-level Sentiment Classification (ASC) is to identify the sentiment polarity towards a specific aspect of a given sentence. Mainstream methods design complicated models, require large-scale annotated training samples, and typically fine-tune pre-trained language models. Such supervised methods may therefore be impractical in real-world scenarios where labeled training corpora are scarce, a.k.a. low-resource settings. To this end, we propose an aspect-specific prompt learning approach (AS-Prompt) that fully exploits pre-trained knowledge and aspect-related information for ASC, enabling pre-trained models with huge numbers of parameters to achieve considerable results under few-shot settings. Specifically, we transform the sentiment classification task into Masked Language Modeling (MLM) by designing appropriate prompts and searching for their ideal expression in continuous space. Meanwhile, we integrate the prompts into the input sentence, adapting the model to the classification task under the guidance of sentiment labels. Experimental results on SemEval-2014 Task 4 show that our proposed method achieves a noticeable improvement over the original BERT models and discrete prompt methods. In addition, we evaluate the model's transfer performance across datasets and demonstrate the superiority of prompt learning when adapting to a new domain, especially under low-resource settings.
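To make the MLM reformulation concrete, the following is a minimal, illustrative sketch of the discrete-prompt baseline the abstract contrasts with: the input sentence is wrapped in an aspect-specific cloze template, and a verbalizer maps the label word an MLM predicts at the [MASK] position back to a sentiment polarity. The template wording, function names, and label words here are assumptions for illustration, not taken from the paper; AS-Prompt itself tunes the prompt in continuous embedding space rather than fixing it to these tokens.

```python
# Illustrative discrete-prompt formulation of ASC as masked language
# modeling. All names (build_prompt, VERBALIZER, label words) are
# hypothetical examples, not the paper's actual templates.

# Verbalizer: maps candidate label words at the [MASK] position
# to sentiment polarities.
VERBALIZER = {
    "great": "positive",
    "terrible": "negative",
    "okay": "neutral",
}

def build_prompt(sentence: str, aspect: str) -> str:
    """Wrap the input sentence in an aspect-specific cloze template."""
    return f"{sentence} The {aspect} was [MASK]."

def polarity_from_label_word(label_word: str) -> str:
    """Map the label word predicted at [MASK] back to a polarity."""
    return VERBALIZER.get(label_word, "unknown")

# Example: for the aspect "service", an MLM such as BERT would score
# the candidate label words at the [MASK] slot of this prompt.
prompt = build_prompt(
    "The pasta was amazing but the service was slow.", "service"
)
```

In the continuous-prompt variant the abstract describes, the template tokens are replaced by trainable embeddings optimized under the guidance of sentiment labels, so no hand-picked wording is needed.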
