Abstract

Dialogue systems have attracted increasing research interest in recent years. In particular, background knowledge is often incorporated to improve their performance. Existing dialogue systems mostly assume that the background knowledge is correct and comprehensive. However, low-quality background knowledge is common in real-world applications. Moreover, dialogue datasets with manually labeled background knowledge are often insufficient. To tackle these challenges, this article presents the background knowledge revising transformer (BKR-Transformer), an algorithm that revises low-quality background knowledge. By formulating the knowledge revision task as a sequence-to-sequence (Seq2Seq) problem, BKR-Transformer generates revised background knowledge from the original background knowledge and the dialogue history. More importantly, to alleviate the effect of insufficient training data, BKR-Transformer introduces parameter sharing and tensor decomposition, which significantly reduce the number of model parameters. Furthermore, this work presents a background knowledge revising and incorporating dialogue model (BKRI) that combines background knowledge revision with response selection in a unified model. Empirical analyses on real-world applications demonstrate that BKRI revises most low-quality background knowledge and substantially outperforms previous dialogue models.
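To make the Seq2Seq formulation and the parameter-reduction ideas concrete, below is a minimal PyTorch sketch, not the paper's implementation: it assumes ALBERT-style cross-layer weight sharing and a simple low-rank (rank-r) factorization as the tensor decomposition; the encoder input stands for the dialogue history plus original knowledge, and the decoder output stands for the revised knowledge. All class and parameter names here are illustrative assumptions.

```python
import torch
import torch.nn as nn


class FactorizedLinear(nn.Module):
    """Low-rank factorization of a d_in x d_out weight matrix,
    cutting parameters from d_in*d_out to roughly rank*(d_in + d_out)."""

    def __init__(self, d_in: int, d_out: int, rank: int):
        super().__init__()
        self.u = nn.Linear(d_in, rank, bias=False)
        self.v = nn.Linear(rank, d_out, bias=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.v(self.u(x))


class SharedSeq2SeqReviser(nn.Module):
    """Toy Seq2Seq knowledge reviser: one encoder layer and one decoder
    layer are reused (shared) across all blocks, and the output projection
    is low-rank, so the parameter count stays small."""

    def __init__(self, vocab_size: int, d_model: int = 256, n_blocks: int = 4):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.enc_layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.dec_layer = nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True)
        self.n_blocks = n_blocks
        self.out = FactorizedLinear(d_model, vocab_size, rank=64)

    def forward(self, src_ids: torch.Tensor, tgt_ids: torch.Tensor) -> torch.Tensor:
        memory = self.embed(src_ids)   # dialogue history + original knowledge
        for _ in range(self.n_blocks): # shared encoder weights, applied repeatedly
            memory = self.enc_layer(memory)
        out = self.embed(tgt_ids)      # revised knowledge tokens (shifted right)
        for _ in range(self.n_blocks): # shared decoder weights, applied repeatedly
            out = self.dec_layer(out, memory)
        return self.out(out)           # per-token vocabulary logits


# Toy usage: batch of 2, source length 16, target length 8, vocabulary of 1000.
model = SharedSeq2SeqReviser(vocab_size=1000)
src = torch.randint(0, 1000, (2, 16))
tgt = torch.randint(0, 1000, (2, 8))
print(model(src, tgt).shape)  # torch.Size([2, 8, 1000])
```

With sharing, the transformer layer cost is paid once rather than once per block, and the factorized output head avoids a full d_model x vocab matrix; the paper's actual decomposition scheme may differ from this rank-64 choice.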
