Abstract

Research on link prediction in knowledge graphs (KGs) has mainly focused on static multi-relational data. In this work we consider temporal knowledge graphs, where relations between entities may only hold for a time interval or at a specific point in time. In line with previous work on static knowledge graphs, we propose to address this problem by learning latent entity and relation type representations. To incorporate temporal information, we utilize recurrent neural networks to learn time-aware representations of relation types, which can be used in conjunction with existing latent factorization methods. The proposed approach is shown to be robust to common challenges in real-world KGs: the sparsity and heterogeneity of temporal expressions. Experiments show the benefits of our approach on four temporal KGs. The data sets are available under a permissive BSD-3 license.

Highlights

  • Knowledge graphs (KGs) are used to organize, manage, and retrieve structured information

  • A KG is of the form G = (E, R), where E is a set of entities and R is a set of relation types or predicates

  • We focus on temporal KGs, where some triples are augmented with time information and the link prediction problem asks for the most probable completion given time information (a minimal representation sketch follows this list)
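
For concreteness, the sketch below shows one way such temporal facts can be represented: standard (subject, predicate, object) triples, optionally augmented with a time expression. This is an illustrative assumption about the data layout, not code from the paper; the TemporalFact type and the example facts are hypothetical.

```python
# Illustrative sketch (an assumption, not the paper's code): temporal KG
# facts as (subject, predicate, object) triples with an optional time field.
from typing import NamedTuple, Optional

class TemporalFact(NamedTuple):
    subject: str
    predicate: str
    obj: str
    time: Optional[str] = None  # e.g. "2009-01-20"; None for a static triple

facts = [
    TemporalFact("BarackObama", "presidentOf", "USA", "2009-01-20"),
    TemporalFact("Brussels", "capitalOf", "Belgium"),  # holds without a date
]
```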


Introduction

Knowledge graphs (KGs) are used to organize, manage, and retrieve structured information. We focus on temporal KGs, where some triples are augmented with time information and the link prediction problem asks for the most probable completion given that time information. Most approaches to link prediction are characterized by a scoring function that operates on the entity and relation type embeddings of a triple (Bordes et al., 2013; Yang et al., 2014; Guu et al., 2015). Learning representations that carry temporal information is challenging due to the sparsity and irregularities of time expressions. Character-level architectures for language modeling (Zhang et al., 2015; Kim et al., 2016) operate on characters as atomic units to derive word embeddings. Inspired by these models, we propose a method to incorporate time information into standard embedding approaches for link prediction.
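
To make this concrete, the following is a minimal sketch of the setup described above: a DistMult-style scoring function (Yang et al., 2014) whose predicate representation is the final hidden state of an LSTM read over the relation-type token followed by the tokens of a time expression. It is written against PyTorch; the class name, token vocabulary, and dimensions are assumptions for illustration, not the paper's exact model.

```python
import torch
import torch.nn as nn

class TimeAwareScorer(nn.Module):
    """Hedged sketch of a time-aware, DistMult-style scorer.
    Names, dimensions, and the token vocabulary are assumptions."""

    def __init__(self, num_entities: int, num_tokens: int, dim: int):
        super().__init__()
        self.entity_emb = nn.Embedding(num_entities, dim)
        # One shared vocabulary for relation-type tokens and time tokens
        # (e.g. the digits of a date, month names, temporal modifiers).
        self.token_emb = nn.Embedding(num_tokens, dim)
        self.lstm = nn.LSTM(input_size=dim, hidden_size=dim, batch_first=True)

    def forward(self, subj, obj, token_seq):
        # token_seq: (batch, seq_len) ids, e.g. [<bornIn>, '1', '9', '8', '4']
        hidden, _ = self.lstm(self.token_emb(token_seq))
        pred = hidden[:, -1, :]            # time-aware predicate embedding
        s, o = self.entity_emb(subj), self.entity_emb(obj)
        return (s * pred * o).sum(dim=-1)  # DistMult-style triple score

# Hypothetical usage: score one fact with a 5-token predicate sequence.
scorer = TimeAwareScorer(num_entities=1000, num_tokens=50, dim=64)
score = scorer(torch.tensor([0]), torch.tensor([1]),
               torch.tensor([[3, 17, 18, 19, 14]]))  # made-up token ids
```

A time-agnostic baseline falls out of the same code by feeding only the relation-type token as the sequence.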

Related Work
Time-Aware Representations
LSTMs for Time-Encoding Sequences
Datasets
General Set-up
Results