Abstract

Knowledge-driven dialogue systems have recently made remarkable progress. Compared with general dialogue systems, they can generate more informative and knowledgeable responses when relevant knowledge is provided in advance. In practical applications, however, such systems cannot be supplied with relevant knowledge beforehand. Hence, to make knowledge-driven dialogue systems practical, it is crucial to retrieve pertinent knowledge dynamically based on the dialogue context. To address this challenge, we introduce DRKQG (Dynamically Retrieving Knowledge via Query Generation for informative dialogue response), a knowledge-driven dialogue system composed of two main modules: a query generation module and a response generation module. First, a time-aware mechanism captures contextual information and generates a query for knowledge retrieval through a search engine. Then, the response generation module, which combines Transformers with a copy mechanism, produces responses conditioned on both the context and the retrieved knowledge. Experimental results on LIC2022 (the Language and Intelligence Technology Competition) show that our system outperforms the baseline model by a large margin on automatic evaluation metrics, while human evaluation by the Baidu Linguistics team shows that our system achieves strong results on the Factually Correct and Knowledgeable criteria.
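
The following is a minimal sketch of the two-stage pipeline described above (query generation, knowledge retrieval via a search engine, then knowledge-grounded response generation). The function names, interfaces, and placeholder logic are assumptions for illustration only; the paper's actual models (the time-aware query generator and the copy-mechanism Transformer) are not reproduced here.

```python
# Hypothetical sketch of the DRKQG-style pipeline: all names and logic
# below are illustrative placeholders, not the authors' implementation.
from typing import List


def generate_query(dialogue_context: List[str]) -> str:
    """Query generation module (assumed interface).

    A time-aware mechanism would weight recent utterances more heavily
    when encoding the context; here we only illustrate the data flow.
    """
    # Placeholder: a trained seq2seq model would map context -> search query.
    return " ".join(dialogue_context[-2:])


def retrieve_knowledge(query: str) -> List[str]:
    """Retrieve candidate knowledge passages with an external search engine.

    The concrete engine/API is an assumption; any retriever that returns
    text passages for a query would fit this slot.
    """
    return [f"passage retrieved for: {query}"]  # stand-in for real results


def generate_response(dialogue_context: List[str], knowledge: List[str]) -> str:
    """Response generation module (assumed interface).

    The abstract describes a Transformer with a copy mechanism, so tokens
    can be copied from the retrieved knowledge into the response.
    """
    # Placeholder: condition on both the context and the retrieved knowledge.
    return f"Response grounded in: {knowledge[0]}"


if __name__ == "__main__":
    context = ["Hi, I love hiking.", "Any good trails near Kyoto?"]
    query = generate_query(context)
    passages = retrieve_knowledge(query)
    print(generate_response(context, passages))
```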
