Abstract

External knowledge-enhanced task-oriented dialogue systems aim to cover user requests beyond pre-defined DBs/APIs. Recent dialogue systems have focused on retrieving external knowledge sources relevant to the dialogue context, achieving competitive results. However, because they do not model entity-aware dialogue intention, such systems struggle to accurately and efficiently link out-of-API functions in real-world scenarios. To tackle this problem, this paper investigates learning dense entity-aware dialogue intentions for external knowledge document retrieval in task-oriented dialogue. To this end, we propose an intention-guided two-stage training approach consisting of an intention-guided training stage and a knowledge transfer stage. The approach leverages rewritten utterances that explicitly convey entity-aware user intentions, and can improve the performance of existing Bi-Encoder retrievers such as DPR (Dense Passage Retriever). In the intention-guided training stage, a posterior history encoder is initialized and guided with rewritten utterances as input to learn discriminative dense representations. In the knowledge transfer stage, these representations are transferred, via an additional intent consistency loss, to a newly initialized prior encoder used for inference. In addition, negative sampling over the test knowledge documents is used to learn more discriminative dense representations for the unseen domain. Our approach requires neither response annotations nor an extra response generator, and it scales well. Experimental results on the augmented MultiWOZ 2.1 dataset show that our approach outperforms all baseline models except relevance classifiers in retrieval accuracy, while maintaining reasonably high efficiency.
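The two-stage scheme described above can be sketched in PyTorch. This is a minimal illustration, not the paper's implementation: the toy `Encoder`, the in-batch contrastive loss, and the use of MSE as the intent consistency loss are all assumptions standing in for the BERT-based DPR encoders and the paper's actual loss formulation.

```python
import torch
import torch.nn.functional as F

# Hypothetical stand-in for a BERT-based DPR encoder (assumption).
class Encoder(torch.nn.Module):
    def __init__(self, vocab_size=1000, dim=64):
        super().__init__()
        self.emb = torch.nn.EmbeddingBag(vocab_size, dim)

    def forward(self, token_ids):
        # (batch, seq_len) token ids -> (batch, dim) dense representation
        return self.emb(token_ids)

doc_encoder = Encoder()
posterior_encoder = Encoder()   # stage 1: sees rewritten, entity-aware utterances
prior_encoder = Encoder()       # stage 2: sees only raw history; used at inference

def contrastive_loss(queries, docs):
    # In-batch negatives: each query should score its own document highest.
    scores = queries @ docs.T
    labels = torch.arange(queries.size(0))
    return F.cross_entropy(scores, labels)

# Toy batch of token ids (real inputs would be tokenized dialogue text).
rewritten = torch.randint(0, 1000, (8, 16))  # rewritten utterances
history = torch.randint(0, 1000, (8, 16))    # raw dialogue histories
docs = torch.randint(0, 1000, (8, 16))       # knowledge documents

# Stage 1 (intention-guided training): posterior encoder learns
# discriminative representations from rewritten utterances.
q_post = posterior_encoder(rewritten)
loss_stage1 = contrastive_loss(q_post, doc_encoder(docs))

# Stage 2 (knowledge transfer): prior encoder is pulled toward the
# posterior representations by an intent consistency term (here MSE,
# an assumption) on top of the retrieval loss.
q_prior = prior_encoder(history)
consistency = F.mse_loss(q_prior, q_post.detach())
loss_stage2 = contrastive_loss(q_prior, doc_encoder(docs)) + consistency
```

At inference time only `prior_encoder` and `doc_encoder` are needed, which is why the approach requires no rewritten utterances (and no response generator) when deployed.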
