Abstract

Reasoning is essential for the development of large knowledge graphs, especially for completion, which aims to infer new triples based on existing ones. Both rules and embeddings can be used for knowledge graph reasoning, and each has its own advantages and difficulties. Rule-based reasoning is accurate and explainable, but rule learning that searches over the graph suffers from poor efficiency due to the huge search space. Embedding-based reasoning is more scalable and efficient because reasoning is conducted via computation between embeddings, but it has difficulty learning good representations for sparse entities, since good embeddings rely heavily on data richness. Based on this observation, in this paper we explore how embedding learning and rule learning can be combined so that each compensates for the other's difficulties with its advantages. We propose IterE, a novel framework that iteratively learns embeddings and rules: rules are learned from embeddings with a proper pruning strategy, and embeddings are learned from existing triples together with new triples inferred by rules. Evaluations of embedding quality show that rules help improve the embeddings of sparse entities and their link prediction results. We also evaluate the efficiency of rule learning and the quality of rules from IterE compared with AMIE+, showing that IterE generates high-quality rules more efficiently. Experiments show that iteratively learning embeddings and rules benefits both during learning and prediction.
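To make the iteration concrete, the following is a minimal Python sketch of the loop described above, assuming hypothetical components train_embeddings, induce_axioms, and apply_axioms (and a score attribute on candidate axioms); it is a sketch of the idea, not IterE's actual implementation.

```python
from typing import Callable, Iterable, Set, Tuple

Triple = Tuple[str, str, str]  # (head entity, relation, tail entity)

def iterate_embeddings_and_rules(
    triples: Set[Triple],
    train_embeddings: Callable[[Set[Triple]], dict],
    induce_axioms: Callable[[dict], Iterable],
    apply_axioms: Callable[[Set[Triple], list], Set[Triple]],
    min_score: float = 0.9,
    iterations: int = 5,
):
    """Alternate between embedding learning, axiom induction, and axiom injection."""
    inferred: Set[Triple] = set()
    embeddings, axioms = {}, []
    for _ in range(iterations):
        # 1. Embedding learning: fit embeddings on existing triples plus
        #    the triples inferred by rules in previous iterations.
        embeddings = train_embeddings(triples | inferred)
        # 2. Axiom induction: propose candidate rules/axioms from the embeddings
        #    and prune those below a quality threshold.
        axioms = [a for a in induce_axioms(embeddings) if a.score >= min_score]
        # 3. Axiom injection: apply the surviving axioms to derive new triples,
        #    which densify sparse entities for the next embedding round.
        inferred |= apply_axioms(triples, axioms)
    return embeddings, axioms, inferred
```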

Highlights

  • Many Knowledge Graphs (KGs), such as Freebase [2] and YAGO [33], have been built in recent years and led to a broad range of applications, including question answering [4], relation extraction [36], and recommender systems [49].

  • Knowledge graph reasoning (KGR) can infer new knowledge from existing knowledge and check knowledge consistency.

  • We further identify a portfolio of ontology axioms for rule learning with embeddings based on the linear map assumption, as illustrated in the sketch below.
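As a toy illustration of that idea (not the paper's exact axiom set or scoring function): under a linear map assumption, a triple (h, r, t) holds when v_h · M_r ≈ v_t, so an axiom such as a relation chain r1 ∘ r2 ⊑ r3 translates into a constraint between relation matrices (M_r1 · M_r2 ≈ M_r3) that can be scored directly from the learned embeddings, without searching the graph. The matrices and the similarity score below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative)

# Relation matrices as they might look after embedding learning:
# r3 approximately equals the composition of r1 and r2.
M_r1 = rng.normal(size=(d, d))
M_r2 = rng.normal(size=(d, d))
M_r3 = M_r1 @ M_r2 + 0.01 * rng.normal(size=(d, d))

def axiom_score(lhs: np.ndarray, rhs: np.ndarray) -> float:
    """Plausibility of an axiom encoded as lhs ≈ rhs (higher is better)."""
    return 1.0 / (1.0 + float(np.linalg.norm(lhs - rhs)))

print(axiom_score(M_r1 @ M_r2, M_r3))  # high: the chain axiom r1 ∘ r2 ⊑ r3 is supported
print(axiom_score(M_r1, M_r3))         # low: r1 alone does not imply r3
```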


Summary

INTRODUCTION

Many Knowledge Graphs (KGs), such as Freebase [2] and YAGO [33], have been built in recent years and led to a broad range of applications, including question answering [4], relation extraction [36], and recommender systems [49]. Embedding-based reasoning is more efficient when there are a large number of relations or triples to reason over. Rule learning methods such as AMIE [10] aim to learn deductive and interpretable inference rules, and deductive rules can infer additional triples for sparse entities, helping embedding learning methods encode them better. We propose an iterative framework, IterE, that combines embedding learning and rule learning to explore the mutual benefits between them. The experimental results show that IterE achieves both better link prediction performance and high-quality rule learning, supporting our goal of combining the strengths of embedding and rule learning: experiments show that using rules and embeddings together leads to better link prediction results, and that IterE learns more high-quality rules more efficiently than conventional rule learning systems.

Knowledge graph embedding
METHOD
Embedding Learning
Axiom Induction
Axiom Injection
EXPERIMENT
Dataset
Training Details
Embedding Evaluation
Rule Evaluation
Iterative learning
Case study
Rule learning
CONCLUSION AND FUTURE WORK