Abstract

Emotion classification is an important task in natural language processing. Existing studies usually treat it as a multi-label classification task, but they fail to effectively capture clause-level information and to highlight weak (low-content) emotions that tend to be overwhelmed by co-existing emotions. To address these limitations, we propose a novel network, "EduEmo", which consists of three parts: a BERT-based encoder, a word-level attention layer, and a RealFormer-based encoder. Specifically, the BERT-based encoder models the associations between labels and words; the word-level attention layer captures representations of elementary discourse units (EDUs), which commonly express a single emotion; and the RealFormer-based encoder leverages sparse attention to highlight the weak emotions and model the associations between labels and EDUs. In addition, we propose an auxiliary-adversarial training algorithm, which adds perturbations to hard samples along the gradient-descent direction, i.e., opposite to the direction used in standard adversarial training. Experimental results on two benchmark datasets show that the proposed model outperforms previous state-of-the-art methods. Further experiments on auxiliary-adversarial training indicate that the proposed algorithm can further improve the generalization performance of adversarial training on emotion classification.
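To make the perturbation direction concrete, the sketch below illustrates the auxiliary-adversarial idea described above under stated assumptions: hard samples are perturbed along the gradient-descent direction, while standard FGM-style adversarial training would perturb along the gradient-ascent direction. This is not the authors' implementation; the model interface, the multi-label loss, and the hyperparameters `eps` and the hard-sample threshold `tau` are all illustrative assumptions.

```python
# Minimal sketch of auxiliary-adversarial perturbation (PyTorch).
# Assumptions: `model` maps word embeddings of shape (batch, seq, dim) to
# multi-label logits of shape (batch, num_labels); `eps` and `tau` are
# hypothetical hyperparameters, not values from the paper.
import torch
import torch.nn.functional as F


def auxiliary_adversarial_loss(model, embeddings, labels, eps=1.0, tau=0.7):
    """Return an adversarial loss where hard samples are perturbed along the
    gradient-DESCENT direction (sign flipped w.r.t. standard adversarial training)."""
    embeddings = embeddings.detach().requires_grad_(True)

    logits = model(embeddings)
    # Per-sample multi-label loss, used both for the gradient and to flag hard samples.
    per_sample_loss = F.binary_cross_entropy_with_logits(
        logits, labels, reduction="none"
    ).mean(dim=-1)
    per_sample_loss.sum().backward()

    # L2-normalize the embedding gradient per sample, as in FGM-style training.
    grad = embeddings.grad
    norm = grad.flatten(1).norm(dim=1).clamp_min(1e-12).view(-1, 1, 1)
    unit_grad = grad / norm

    # Hard samples (loss above tau): perturb along -grad (descent direction).
    # Other samples: perturb along +grad, as standard adversarial training would.
    hard = (per_sample_loss > tau).float().view(-1, 1, 1)
    direction = (1.0 - 2.0 * hard) * unit_grad
    adv_embeddings = embeddings.detach() + eps * direction

    adv_logits = model(adv_embeddings)
    return F.binary_cross_entropy_with_logits(adv_logits, labels)
```

In a training loop, this adversarial loss would typically be added to the clean-sample loss before the optimizer step, mirroring how FGM-style adversarial training is usually combined with the task objective.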
