Abstract

Knowledge graphs (KGs) are of great importance to many artificial intelligence applications, but they usually suffer from incompleteness. Knowledge graph embedding (KGE), which aims to represent entities and relations in low-dimensional continuous vector spaces, has proven to be a promising approach for KG completion. Traditional KGE methods concentrate only on structured triples and pay less attention to the type information of entities. In fact, incorporating entity types into embedding learning can further improve the performance of KG completion. To this end, we propose a universal Type-augmented Knowledge graph Embedding framework (TaKE), which can utilize type features to enhance any traditional KGE model. TaKE automatically captures type features without explicit type supervision, and by learning different type representations for each entity, it can distinguish the diversity of types specific to distinct relations. We also design a new type-constrained negative sampling strategy to construct more effective negative samples during training. Extensive experiments on four datasets from three real-world KGs (Freebase, WordNet and YAGO) demonstrate the merits of our proposed framework. In particular, combining TaKE with the recent tensor factorization KGE model SimplE achieves state-of-the-art performance on the KG completion task.
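
To give a sense of the general idea behind type-constrained negative sampling, the sketch below corrupts the tail of a triple only with entities whose type is plausible for the relation. It is a minimal illustration under the assumption that explicit type labels and per-relation range constraints are available (names such as entity_types and relation_ranges are hypothetical); TaKE itself learns type features without such supervision, so this is not the paper's exact procedure.

```python
import random

def corrupt_tail(triple, entities, entity_types, relation_ranges):
    """Replace the tail of (h, r, t) with an entity whose type is
    compatible with relation r, yielding a harder negative sample."""
    h, r, t = triple
    allowed_types = relation_ranges.get(r)  # types plausible as tails of r
    candidates = [e for e in entities
                  if e != t and (allowed_types is None
                                 or entity_types.get(e) in allowed_types)]
    if not candidates:  # fall back to uniform corruption if no typed candidate exists
        candidates = [e for e in entities if e != t]
    return (h, r, random.choice(candidates))

# Toy usage: only "country"-typed entities are valid tails of capital_of,
# so the corrupted triple stays type-consistent and harder to score as false.
entities = ["Paris", "France", "Germany", "Einstein"]
entity_types = {"Paris": "city", "France": "country",
                "Germany": "country", "Einstein": "person"}
relation_ranges = {"capital_of": {"country"}}
print(corrupt_tail(("Paris", "capital_of", "France"),
                   entities, entity_types, relation_ranges))
```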
