Abstract

Predicting missing facts in a knowledge graph (KG) is a crucial task for knowledge base construction and reasoning, and it has recently been studied extensively with KG embeddings. While existing KG embedding approaches mainly learn and predict facts within a single KG, a more plausible solution would benefit from the knowledge in multiple language-specific KGs, since different KGs have their own strengths and limitations in data quality and coverage. This is challenging, however, because knowledge transfer among multiple independently maintained KGs is often hindered by insufficient alignment information and inconsistencies among the described facts. In this paper, we propose KEnS, a novel framework for embedding learning and ensemble knowledge transfer across a number of language-specific KGs. KEnS embeds all KGs in a shared embedding space, where entity associations are captured via self-learning. KEnS then performs ensemble inference to combine prediction results from the embeddings of multiple language-specific KGs, for which several ensemble techniques are investigated. Experiments on five real-world language-specific KGs show that KEnS consistently improves over state-of-the-art methods on KG completion by effectively identifying and leveraging complementary knowledge.
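The abstract describes ensemble inference over per-KG predictors without specifying the combination rule here. Below is a minimal sketch of that idea, assuming a simple weighted reciprocal-rank voting scheme; the scheme, function names, and weights are illustrative assumptions, not the paper's actual method (KEnS investigates multiple ensemble techniques).

```python
# Illustrative sketch of ensemble inference over multiple KG-specific models.
# The aggregation rule (weighted reciprocal-rank voting) is an assumption for
# illustration only; KEnS itself investigates several ensemble techniques.
from collections import defaultdict

def ensemble_predict(ranked_lists, weights=None):
    """Combine per-KG ranked candidate entities into one ranking.

    ranked_lists: one ranking per language-specific KG model;
                  each ranking lists candidate entity IDs, best first.
    weights:      optional per-KG reliability weights (uniform by default).
    """
    if weights is None:
        weights = [1.0] * len(ranked_lists)
    scores = defaultdict(float)
    for ranking, w in zip(ranked_lists, weights):
        for rank, entity in enumerate(ranking, start=1):
            scores[entity] += w / rank  # reciprocal-rank vote
    return sorted(scores, key=scores.get, reverse=True)

# Example: three KG models propose tails for the same (head, relation) query.
print(ensemble_predict([
    ["Tokyo", "Kyoto", "Osaka"],   # e.g. English-KG model
    ["Kyoto", "Tokyo", "Nara"],    # e.g. Japanese-KG model
    ["Tokyo", "Nara", "Kyoto"],    # e.g. French-KG model
]))  # -> ['Tokyo', 'Kyoto', 'Nara', 'Osaka']
```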

Highlights

  • Knowledge graphs (KGs) store structured representations of real-world entities and relations, constituting actionable knowledge that is crucial to various knowledge-driven applications (Koncel-Kedziorski et al., 2019; Chen et al., 2018a; Bordes et al., 2014)

  • We provide detailed case studies to interpret how a sparse, low-resource KG can benefit from the embeddings of other KGs, and how knowledge exclusive to one KG can be broadcast to others

  • We observe that KEnS brings larger gains on sparser KGs than on the well-populated ones

Summary

Introduction

Knowledge graphs (KGs) store structured representations of real-world entities and relations, constituting actionable knowledge that is crucial to various knowledge-driven applications (Koncel-Kedziorski et al., 2019; Chen et al., 2018a; Bordes et al., 2014). Extensive efforts have been invested in KG embedding models, which encode entities as low-dimensional vectors and capture relations as algebraic operations on entity vectors. These models provide a beneficial tool for completing KGs by discovering previously unknown knowledge from latent representations of observed facts. Representative models, including translational models (Bordes et al., 2013; Wang et al., 2014) and bilinear models (Yang et al., 2015; Trouillon et al., 2016), have achieved satisfactory performance in predicting missing facts.
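To make the two cited model families concrete, the sketch below shows their standard scoring functions: TransE (Bordes et al., 2013) treats a relation as a translation h + r ≈ t, while DistMult (Yang et al., 2015) scores a triple with a trilinear product. The random vectors stand in for trained embeddings and are purely illustrative.

```python
# Minimal sketches of the cited scoring functions (NumPy for clarity).
import numpy as np

def transe_score(h, r, t):
    # TransE: higher is better; negative distance between the
    # translated head (h + r) and the tail t.
    return -np.linalg.norm(h + r - t)

def distmult_score(h, r, t):
    # DistMult: bilinear score <h, r, t> = sum_i h_i * r_i * t_i.
    return np.sum(h * r * t)

# Placeholder embeddings; in practice these are learned from observed facts.
rng = np.random.default_rng(0)
h, r, t = (rng.normal(size=50) for _ in range(3))
print(transe_score(h, r, t), distmult_score(h, r, t))
```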
