Abstract

Continual learning (CL) aims to learn a sequence of tasks whose datasets arrive incrementally over time, without a predetermined number of tasks. CL models strive to achieve two primary objectives: preventing catastrophic forgetting and facilitating knowledge transfer between tasks. Catastrophic forgetting refers to the sharp decline in a CL model's performance on previously learned tasks as new ones are learned. Knowledge transfer, which leverages knowledge acquired from previous tasks, enables the CL model to tackle new tasks more effectively. However, only a few CL models proposed so far achieve both objectives simultaneously. In this paper, we present a task-incremental CL model that leverages a pre-trained language model (i.e., BERT) with injected CL-plugins to mitigate catastrophic forgetting. In addition, we propose two contrastive learning-based losses, the contrastive ensemble distillation (CED) loss and the contrastive supervised learning of the current task (CSC) loss, to enhance our model's performance. The CED loss improves the knowledge transferability of our continual learning model, while the CSC loss improves its performance on the current task. Experimental results on benchmark datasets demonstrate that our proposed model outperforms existing continual learning models in the task-incremental learning setting for continual aspect sentiment classification.

Keywords: Continual Learning, Contrastive Learning, Aspect-Sentiment Classification.
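To make the contrastive component concrete, the sketch below shows a generic supervised contrastive loss of the kind the CSC objective builds on: representations of current-task examples sharing a label are pulled together while other examples are pushed apart. The function name, tensor shapes, and temperature value are illustrative assumptions, not the paper's implementation; the paper's CED loss additionally distills across task modules, which is not shown here.

```python
# Minimal sketch of a supervised contrastive loss (assumed form, PyTorch).
import torch
import torch.nn.functional as F

def supervised_contrastive_loss(features: torch.Tensor,
                                labels: torch.Tensor,
                                temperature: float = 0.1) -> torch.Tensor:
    """features: (N, D) encoder outputs; labels: (N,) class ids."""
    features = F.normalize(features, dim=1)
    sim = features @ features.T / temperature          # pairwise similarities
    self_mask = torch.eye(len(labels), dtype=torch.bool, device=features.device)
    sim = sim.masked_fill(self_mask, -1e9)             # exclude self-pairs
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    # positives: other samples with the same label as the anchor
    pos_mask = (labels.unsqueeze(0) == labels.unsqueeze(1)) & ~self_mask
    pos_count = pos_mask.sum(dim=1).clamp(min=1)
    loss = -(log_prob * pos_mask).sum(dim=1) / pos_count
    return loss.mean()
```

In a CL training loop, such a term would typically be added to the cross-entropy loss on the current task's mini-batch, with the temperature and loss weight treated as hyperparameters.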
