Abstract

Previous multi-turn dialogue approaches based on global Knowledge Graphs (KGs) still suffer from generic, uncontrollable, and incoherent response generation. Most of them neither consider the local topic-level semantic information of KGs nor effectively merge the information of long dialogue contexts and KGs into dialogue generation. To tackle these issues, we propose a Topic-level Knowledge-aware Dialogue Generation model that captures context-aware topic-level knowledge information. Our method thus accounts for the topic-coherence, fluency, and diversity of generated responses. Specifically, we first decompose the given KG into a set of topic-level sub-graphs, with each sub-graph capturing a semantic component of the input KG. Furthermore, we design a Topic-level Sub-graphs Attention Network to compute a comprehensive representation of both the sub-graphs and previous turns of dialogue utterances, which is then decoded together with the current turn into a response. By using sub-graphs, our model is able to attend to different topical components of the KG and enhance topic-coherence. We perform extensive experiments on two datasets, DuRecDial and KdConv, to demonstrate the effectiveness of our model. The experimental results show that our model outperforms existing strong baselines.
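The abstract does not specify the internals of the Topic-level Sub-graphs Attention Network; as a rough illustration of the general idea, a minimal dot-product-attention sketch is shown below, where a dialogue-context vector attends over per-sub-graph embeddings. All names, shapes, and the scoring function here are assumptions for illustration, not the paper's actual architecture.

```python
import numpy as np

def subgraph_attention(context, subgraphs):
    """Fuse topic-level sub-graph embeddings via context-conditioned attention.

    context:   (d,) vector summarizing previous dialogue turns (assumed encoder output)
    subgraphs: (k, d) matrix, one embedding per topic-level sub-graph
    Returns (weights, fused): attention weights over the k sub-graphs and the
    resulting topic-aware knowledge vector fed to the decoder.
    """
    scores = subgraphs @ context                 # (k,) relevance score per sub-graph
    weights = np.exp(scores - scores.max())      # numerically stable softmax
    weights = weights / weights.sum()
    fused = weights @ subgraphs                  # (d,) weighted sum of sub-graph embeddings
    return weights, fused
```

In this sketch, the softmax weights let the decoder emphasize whichever topical component of the KG best matches the dialogue context, which is the intuition behind the reported topic-coherence gains.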

