Abstract

Discovering Out-of-Domain (OOD) intents is essential for developing new skills in a task-oriented dialogue system. Previous methods suffer from poor knowledge transferability from in-domain (IND) intents to OOD intents, as well as inefficient iterative clustering. In this paper, we propose an efficient unified contrastive learning framework to discover OOD intents, bridging the gap between the IND pre-training stage and the OOD clustering stage. Specifically, we employ a supervised contrastive learning (SCL) objective to learn discriminative pre-trained intent features for clustering. We then introduce an efficient end-to-end contrastive clustering method to jointly learn representations and cluster assignments. In addition, we propose an adaptive contrastive learning (ACL) method that automatically adjusts the weights of different negative sample pairs for a given anchor according to their semantic similarities. Extensive experiments on two benchmark datasets show that our method is more robust and achieves substantial improvements over state-of-the-art methods.
