Abstract

This paper proposes a novel model that learns an embedding representation of a given first-order logic query and applies it to Question Answering (QA) over a knowledge graph, with a particular focus on handling complex logical queries and mining the multi-hop paths contained in the knowledge graph. In the proposed model, the node (i.e., entity) embeddings and the query embeddings are trained jointly in the same latent semantic feature space, so that the matching degree between a query (i.e., question) and candidate entities (i.e., answers) can be measured by the semantic distances between them. Recent years have witnessed great advances in QA models based on knowledge graphs; the main difference between this previous work and ours is that traditional QA focuses on understanding natural language, whereas we concentrate on modeling and understanding the logical form. Experiments on several knowledge graph reasoning tasks with real-world datasets demonstrate that the proposed logical query learning model is more effective than state-of-the-art models.
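
The abstract describes answer retrieval as a nearest-neighbor search between a query embedding and entity embeddings in a shared latent space. The following is a minimal sketch of that scoring step only, not the authors' implementation; the embedding dimensionality, entity count, and the use of Euclidean distance as the matching measure are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 64              # assumed embedding dimensionality
num_entities = 1000   # assumed number of entities in the knowledge graph

# In the paper these would be jointly trained in one latent space;
# random stand-ins are used here purely for illustration.
entity_emb = rng.normal(size=(num_entities, dim))  # entity (node) embeddings
query_emb = rng.normal(size=(dim,))                # embedding of a logical query

# Matching degree as negative Euclidean distance:
# entities closer to the query embedding are better candidate answers.
dists = np.linalg.norm(entity_emb - query_emb, axis=1)
top_k = np.argsort(dists)[:10]  # indices of the 10 closest candidate answers
print(top_k)
```

A distance-based score of this kind is one common way to realize the "semantic distance" matching the abstract refers to; the paper's actual scoring function may differ.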
