Abstract

External document knowledge helps dialogue systems generate high-quality responses. Although several knowledge-grounded dialogue models have been designed, they cannot comprehensively exploit external knowledge because of the complex relationships among dialogue context, knowledge, and responses. To this end, we propose a novel transformer-based model, named TransIKG, which incorporates external document knowledge for dialogue generation. TransIKG comprises a two-step integration mechanism consisting of correlation integration and overall integration: correlation integration is designed to fully exploit the pairwise mutual information among dialogue context, knowledge, and responses, while overall integration adopts an integration gate to capture global information. Furthermore, we utilize the positional information of dialogue turns to better represent the dialogue context and to enhance the generalization ability of our model on out-of-domain documents. Finally, we propose a novel knowledge-aware pointer network to generate knowledge-enhanced response tokens. Experimental results on two benchmark datasets demonstrate that our model outperforms state-of-the-art models on both open-domain and domain-specific dialogues.
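To make the two mechanisms named above concrete, the following is a minimal sketch (not the authors' code) of (a) an integration gate that fuses a context-aware and a knowledge-aware representation and (b) a pointer-style mixture that lets the decoder copy tokens from the knowledge document. All module and variable names (IntegrationGate, KnowledgePointer, hidden_size, etc.) and the tensor shapes are assumptions for illustration only.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class IntegrationGate(nn.Module):
    """Gated fusion of a context-aware and a knowledge-aware representation."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.gate_proj = nn.Linear(2 * hidden_size, hidden_size)

    def forward(self, context_repr: torch.Tensor, knowledge_repr: torch.Tensor) -> torch.Tensor:
        # g in (0, 1) decides, per dimension, how much of each representation to keep.
        g = torch.sigmoid(self.gate_proj(torch.cat([context_repr, knowledge_repr], dim=-1)))
        return g * context_repr + (1.0 - g) * knowledge_repr


class KnowledgePointer(nn.Module):
    """Mixes a generation distribution with a copy distribution over knowledge tokens."""

    def __init__(self, hidden_size: int, vocab_size: int):
        super().__init__()
        self.vocab_proj = nn.Linear(hidden_size, vocab_size)
        self.copy_gate = nn.Linear(hidden_size, 1)

    def forward(self, decoder_state, knowledge_attn, knowledge_token_ids):
        # decoder_state: (batch, hidden); knowledge_attn: (batch, k_len) attention
        # over knowledge tokens; knowledge_token_ids: (batch, k_len) vocabulary ids.
        p_vocab = F.softmax(self.vocab_proj(decoder_state), dim=-1)
        # Scatter attention mass onto the vocabulary positions of the knowledge tokens.
        p_copy = torch.zeros_like(p_vocab).scatter_add(-1, knowledge_token_ids, knowledge_attn)
        lam = torch.sigmoid(self.copy_gate(decoder_state))  # copy-vs-generate weight
        return lam * p_vocab + (1.0 - lam) * p_copy
```

The gate lets the model weight context against knowledge per dimension rather than concatenating them blindly, and the pointer mixture is one standard way to bias generation toward tokens that appear in the grounding document.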
