Abstract

Knowledge-grounded dialogue is a task in which external knowledge is used to generate appropriate and fluent responses to utterances. Because generating responses grounded in relevant knowledge matters across diverse fields, the knowledge selection task has received growing attention. In this study, we propose a novel selection model that applies contrastive learning with a negative-sampling loss to create dialogue-centric representations of knowledge. A two-part loss is considered: a knowledge selection loss and a topic prediction loss. The former increases the similarity between the content representations of the relevant knowledge and the dialogue history, while the latter increases the similarity between their topic representations. The proposed model was evaluated on two well-known datasets, Wizard of Wikipedia and Holl-E, for the knowledge-grounded dialogue task, exhibiting remarkable improvement over previously proposed methods on both knowledge selection and response generation.
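To make the two-part objective concrete, below is a minimal sketch of how such a contrastive loss with negative sampling could be written. This is not the authors' implementation: the InfoNCE-style formulation, the temperature `tau`, the weighting factor `alpha`, and all function and tensor names are illustrative assumptions; the abstract only specifies that one loss term operates on content representations and the other on topic representations.

```python
import torch
import torch.nn.functional as F

def contrastive_selection_loss(dialogue_repr, pos_knowledge_repr,
                               neg_knowledge_repr, tau=0.1):
    """InfoNCE-style contrastive loss with negative sampling (assumed form).

    Pulls the dialogue-history representation toward the gold knowledge
    representation and pushes it away from sampled negatives.

    dialogue_repr:      (B, d)    dialogue-history embeddings
    pos_knowledge_repr: (B, d)    gold knowledge embeddings
    neg_knowledge_repr: (B, K, d) K sampled negative knowledge embeddings
    """
    q = F.normalize(dialogue_repr, dim=-1)
    pos = F.normalize(pos_knowledge_repr, dim=-1)
    neg = F.normalize(neg_knowledge_repr, dim=-1)

    pos_sim = (q * pos).sum(-1, keepdim=True) / tau        # (B, 1)
    neg_sim = torch.einsum("bd,bkd->bk", q, neg) / tau     # (B, K)

    # The gold knowledge sits at index 0 of the concatenated logits.
    logits = torch.cat([pos_sim, neg_sim], dim=-1)         # (B, 1 + K)
    labels = torch.zeros(q.size(0), dtype=torch.long, device=q.device)
    return F.cross_entropy(logits, labels)

def two_part_loss(content_dlg, content_pos, content_neg,
                  topic_dlg, topic_pos, topic_neg, alpha=1.0):
    # Knowledge selection loss on content representations plus
    # topic prediction loss on topic representations; the relative
    # weight alpha is an assumption, not taken from the paper.
    l_ks = contrastive_selection_loss(content_dlg, content_pos, content_neg)
    l_tp = contrastive_selection_loss(topic_dlg, topic_pos, topic_neg)
    return l_ks + alpha * l_tp
```

In this reading, both terms share the same contrastive form and differ only in which representations they compare, which is one straightforward way to realize the two-part loss the abstract describes.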
