Abstract

With increasing research interest in dialogue modeling, an emerging branch formulates the task as next-sentence selection: given the partial dialogue context, the goal is to determine the most probable next sentence. To model natural language, recurrent models have been applied to sequence modeling and have shown promising results on various NLP tasks (Sutskever et al., 2014). More recently, the Transformer (Vaswani et al., 2017) has advanced semantic modeling of natural language sentences via attention, further improving sequence modeling. However, the Transformer focuses on intra-sentence attention and ignores inter-sentence information. In dialogue modeling, cross-sentence information is salient for understanding dialogue content and thus for selecting the correct response. Therefore, this paper proposes a novel attention mechanism based on multi-head attention, called highway attention, which allows the model to pass information across multiple sentences, and builds on it a recurrent model combining the Transformer with the proposed highway attention. We call this model the Highway Recurrent Transformer. It captures not only intra-sentence dependencies but also inter-sentence dependencies in the structure of dialogues. Experiments on the response selection task of the seventh Dialog System Technology Challenge (DSTC7) demonstrate that the proposed Highway Recurrent Transformer models both utterance-level and dialogue-level information, achieving better performance than the original Transformer in the single-positive-response scenario.
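The abstract does not give the exact formulation of highway attention, so the sketch below illustrates one plausible reading under stated assumptions: a multi-head cross-attention over the preceding utterance, merged with the current utterance through a highway-style gate, applied recurrently turn by turn. All names here (HighwayAttention, gate, the toy dialogue loop) are illustrative assumptions, not the paper's actual implementation.

# Hypothetical sketch of highway attention, assuming PyTorch. Each utterance
# attends to the previous utterance's token representations (inter-sentence
# information), and a learned gate decides, per dimension, how much of that
# cross-sentence signal to pass through versus carrying the intra-sentence
# representation unchanged.
import torch
import torch.nn as nn

class HighwayAttention(nn.Module):
    def __init__(self, d_model: int, num_heads: int):
        super().__init__()
        self.cross_attn = nn.MultiheadAttention(d_model, num_heads, batch_first=True)
        self.gate = nn.Linear(d_model, d_model)  # highway-style transform gate

    def forward(self, current: torch.Tensor, previous: torch.Tensor) -> torch.Tensor:
        # current:  (batch, cur_len, d_model)  tokens of the current utterance
        # previous: (batch, prev_len, d_model) tokens of the preceding utterance
        attended, _ = self.cross_attn(current, previous, previous)
        t = torch.sigmoid(self.gate(current))  # gate values in [0, 1]
        # Highway combination: gated cross-sentence information plus a
        # carry path for the current utterance's own representation.
        return t * attended + (1.0 - t) * current

# Recurrent use over a dialogue: each utterance is updated with information
# from its predecessor, so context flows forward across turns.
layer = HighwayAttention(d_model=256, num_heads=8)
utterances = [torch.randn(1, 12, 256) for _ in range(4)]  # toy 4-turn dialogue
state = utterances[0]
for utt in utterances[1:]:
    state = layer(utt, state)

This mirrors the abstract's claim at a high level: the Transformer's self-attention handles intra-sentence dependencies, while the gated cross-attention carries inter-sentence information through the dialogue; the specific gating and recurrence used in the paper may differ.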
