Abstract

Knowledge graph embedding aims to embed the entities and relations of a knowledge graph into continuous vector spaces, which benefits a variety of real-world applications. Among existing solutions, translational models, which design their score functions as geometric translations, have drawn much attention. However, these models rely primarily on evidence of whether observed triplets are plausible, ignoring the fact that a relation also imposes semantic constraints on its subject and object entities. In this paper, we present a general framework for enhancing knowledge graph embedding with relational constraints (KRC). Specifically, we design the score function to encode regularities between a relation and its arguments into the translational embedding space. Additionally, we propose a soft margin-based ranking loss for effectively training the KRC model, which characterizes the varying semantic distances between negative and positive triplets. Furthermore, we combine these regularities with distributional representations to predict missing triplets, which offers a degree of robustness. We evaluate our method on the tasks of knowledge graph completion and entity classification. Extensive experiments show that KRC achieves better or comparable performance relative to state-of-the-art methods. Moreover, KRC yields substantial improvements on long-tail entities, which have few instances in the knowledge graph.
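As a rough illustration of the translational setup the abstract refers to, the sketch below shows a TransE-style score together with a margin-based ranking loss in NumPy. The exact KRC score function and its soft-margin formulation are defined in the paper body, not here; the `weight` term is a hypothetical stand-in for the triplet-specific semantic distance the abstract mentions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy embedding tables: 5 entities and 2 relations in a 16-dim space.
n_entities, n_relations, dim = 5, 2, 16
E = rng.normal(scale=0.1, size=(n_entities, dim))   # entity embeddings
R = rng.normal(scale=0.1, size=(n_relations, dim))  # relation embeddings

def translational_score(h, r, t):
    """Lower is better: the translation distance ||h + r - t||."""
    return np.linalg.norm(E[h] + R[r] - E[t])

def soft_margin_ranking_loss(pos, neg, margin=1.0):
    """Margin-based ranking loss over (positive, negative) triplet pairs.

    A soft margin in the spirit of the abstract would scale the margin by
    the semantic distance between each negative and positive triplet; the
    constant `weight` below is a placeholder for that quantity.
    """
    loss = 0.0
    for (h, r, t), (h_n, r_n, t_n) in zip(pos, neg):
        weight = 1.0  # placeholder for a triplet-specific semantic distance
        loss += max(0.0, weight * margin
                    + translational_score(h, r, t)
                    - translational_score(h_n, r_n, t_n))
    return loss / len(pos)

# One positive triplet and a corrupted negative (tail entity replaced).
positives = [(0, 0, 1)]
negatives = [(0, 0, 3)]
print(soft_margin_ranking_loss(positives, negatives))
```

Minimizing this loss pushes plausible triplets to score lower (closer under translation) than corrupted ones by at least the margin, which is the standard training signal for translational models.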
