Abstract

Combining first-order logic rules with a Knowledge Graph (KG) embedding model has recently gained increasing attention, as rules introduce rich background information. Among such studies, models equipped with soft rules, which are extracted with certain confidences, achieve state-of-the-art performance. However, existing methods either cannot support transitivity and composition rules, or take soft rules as regularization terms to constrain derived facts, which cannot encode the logical background knowledge about facts contained in soft rules. In addition, previous works performed one-time logical inference over rules to generate valid groundings for modeling rules, ignoring forward chaining inference, which can generate more valid groundings and thus model rules better. To address these issues, this paper proposes Soft Logical rules enhanced Embedding (SoLE), a novel KG embedding model equipped with a joint training algorithm over soft rules and KG facts to inject the logical background knowledge of rules into embeddings, as well as forward chaining inference over rules. Evaluations on Freebase and DBpedia show that SoLE not only achieves improvements of 11.6%/5.9% in Mean Reciprocal Rank (MRR) and 18.4%/15.9% in HITS@1 over the model on which SoLE is based, but also significantly and consistently outperforms state-of-the-art baselines on the link prediction task.

Highlights

  • A Knowledge Graph (KG), also known as a knowledge base, is a structured representation of knowledge about our world

  • The results demonstrated that the devised joint training algorithm for soft rules and the introduction of forward chaining improved KG embeddings, and our method outperformed the baseline methods

  • This paper proposed Soft Logical rules enhanced Embedding (SoLE), a novel KG embedding model


Summary

Introduction

A Knowledge Graph (KG), also known as a knowledge base, is a structured representation of knowledge about our world. A KG is a collection of triples, or facts, composed of entities that represent real-world objects and relations that express the relationships between entities, in the form of (head entity, relation, tail entity), abbreviated as (h, r, t). These triples can be formalized as a directed multi-relational graph, where nodes denote entities and an edge directed from node h to node t indicates the relation r between them. The main idea of KG embedding is to embed entities and relations into a low-dimensional continuous vector space. Such vectorial representations of a KG can further benefit a wide variety of downstream tasks such as KG completion [9], entity resolution [10], and relation extraction [11].
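The triple representation and the embedding idea described above can be sketched in a few lines. The sketch below uses a TransE-style translation score (h + r ≈ t) purely for illustration; SoLE's actual embedding model is not specified in this excerpt, and the example entities, relations, and dimension are made up.

```python
import numpy as np

# Toy KG: a collection of (head entity, relation, tail entity) triples.
# Entity and relation names here are illustrative, not from the paper.
triples = [
    ("Paris", "capital_of", "France"),
    ("France", "located_in", "Europe"),
]

rng = np.random.default_rng(0)
dim = 8  # embedding dimension (illustrative)

# Embed every entity and relation as a low-dimensional real vector.
entities = {e for h, _, t in triples for e in (h, t)}
relations = {r for _, r, _ in triples}
ent_emb = {e: rng.normal(size=dim) for e in entities}
rel_emb = {r: rng.normal(size=dim) for r in relations}

def score(h, r, t):
    """TransE-style plausibility: distance ||h + r - t||.

    Smaller distance means the triple is judged more plausible;
    training would adjust the vectors so true triples score low.
    """
    return float(np.linalg.norm(ent_emb[h] + rel_emb[r] - ent_emb[t]))

print(score("Paris", "capital_of", "France"))
```

In link prediction, such a scoring function is used to rank candidate entities for a query like (Paris, capital_of, ?), which is how metrics such as MRR and HITS@1 are computed.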
