Abstract
The goal of natural language inference (NLI) is to judge the logical relationship between a pair of sentences: entailment, contradiction, or neutral. Many researchers have shown that introducing external knowledge is helpful for NLI. However, existing models neither exploit knowledge information effectively nor explore context information sufficiently. In this paper, we propose a multi-branch network based on the synergy of knowledge and context: both kinds of information contribute to performance while being modeled separately in independent branches. In the context branch, we present a multi-level, dynamic assisted attention that builds sufficient interaction between the sentence pair. In the knowledge branch, we design a Knowledge-based Graph Attention Network (K-GAT) to capture the structural information of the knowledge, with an attention mechanism used for knowledge interaction. In addition, to strengthen the relation between the sentences, we provide a relation branch that captures the context and knowledge relations of the pair. To avoid introducing redundant external knowledge, we restrict ourselves to five selected knowledge types based on semantic dependencies. Experiments show that our model achieves strongly competitive results on three popular NLI datasets.
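The abstract does not give the K-GAT equations, but the core operation of any graph attention layer over a knowledge graph can be illustrated. Below is a minimal single-head graph-attention sketch in the style of standard GAT layers; the function name, shapes, and parameters are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def graph_attention(H, A, W, a):
    """Minimal single-head graph attention (illustrative sketch only,
    NOT the paper's K-GAT).
    H: (N, F) node features, A: (N, N) adjacency with self-loops,
    W: (F, F2) projection matrix, a: (2*F2,) attention vector."""
    Z = H @ W                                   # project node features
    N = Z.shape[0]
    # pairwise attention logits e_ij = LeakyReLU(a . [z_i || z_j])
    e = np.zeros((N, N))
    for i in range(N):
        for j in range(N):
            e[i, j] = np.concatenate([Z[i], Z[j]]) @ a
    e = np.where(e > 0, e, 0.2 * e)             # LeakyReLU
    e = np.where(A > 0, e, -1e9)                # mask non-edges
    # softmax over each node's neighbourhood
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = alpha / alpha.sum(axis=1, keepdims=True)
    return alpha @ Z                            # attention-weighted aggregation

# usage on a toy 4-node path graph (random features and weights)
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 3))
A = np.eye(4) + np.diag(np.ones(3), 1) + np.diag(np.ones(3), -1)
W = rng.normal(size=(3, 2))
a = rng.normal(size=(4,))
out = graph_attention(H, A, W, a)               # shape (4, 2)
```

Each output row is a convex combination of the projected features of a node's graph neighbours, which is how structural information of the knowledge graph would be folded into the node representations.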