Abstract

A Knowledge Graph (KG) usually contains billions of facts about the real world, where each fact is represented as a triplet of the form (head entity, relation, tail entity). A KG is a complex network consisting of numerous nodes (entities) and edges (relations). Since most KGs are noisy and far from complete, KG analysis and completion methods are becoming increasingly important. Knowledge graph embedding (KGE) aims to embed entities and relations in a low-dimensional, continuous vector space, and has proven to be an efficient and effective approach to knowledge graph completion tasks. KGE models devise various score functions to evaluate each fact in a KG, assigning high scores to true facts and low scores to invalid ones. In a real-world KG, some nodes may have hundreds of links to other nodes. There is a wealth of information around an entity, and this surrounding information (i.e., the sub-graph structure of the entity) can contribute significantly to predicting new facts. However, many previous works, including translational approaches such as TransE, TransH, TransR, and TransD, factorization approaches such as DistMult and ComplEx, and deep learning approaches such as NTN and ConvE, score each fact in isolation and lack a mechanism specifically designed to learn the sub-graph structure information of an entity in a KG. To address this challenge, we leverage the information fusion mechanism (Graph2Seq) used in graph neural networks, which is specially designed for graph-structured data, to learn fusion embeddings for entities in a KG, and we propose a novel fusion embedding learning KGE model (referred to as G2SKGE) that aims to learn the sub-graph structure information of an entity in a KG. In empirical experiments on four benchmark datasets, our proposed model achieves promising results and outperforms state-of-the-art models.
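To make the fusion idea above concrete, the sketch below shows how an entity's own embedding could be combined with the embeddings of its incoming and outgoing links. This is a minimal illustration under our own simplified assumptions (toy random embeddings, made-up triples about "Paris", and plain mean pooling as the aggregator); it is not the paper's exact Graph2Seq/G2SKGE formulation.

```python
import numpy as np

# Illustrative sketch only: the entities, relations, and the mean-pooling fusion
# below are assumptions for exposition, not the G2SKGE aggregator itself.
rng = np.random.default_rng(0)
dim = 8

# Toy embeddings for three entities and two relations.
entity_emb = {e: rng.normal(size=dim) for e in ["Paris", "France", "Eiffel_Tower"]}
relation_emb = {r: rng.normal(size=dim) for r in ["capital_of", "located_in"]}

# Sub-graph of the target entity "Paris":
# outgoing link (Paris, capital_of, France) and incoming link (Eiffel_Tower, located_in, Paris).
outgoing = [("capital_of", "France")]        # (relation, tail)
incoming = [("located_in", "Eiffel_Tower")]  # (relation, head)

def fuse(entity, outgoing, incoming):
    """Fuse an entity's own embedding with messages from its neighboring links
    by simple mean pooling (a stand-in for a learned graph aggregator)."""
    messages = [entity_emb[entity]]
    for r, t in outgoing:
        messages.append(relation_emb[r] + entity_emb[t])
    for r, h in incoming:
        messages.append(entity_emb[h] + relation_emb[r])
    return np.mean(messages, axis=0)

fused_paris = fuse("Paris", outgoing, incoming)
print(fused_paris.shape)  # (8,) -- a single fused vector for the target entity
```

In the actual model, such a fused embedding would replace the isolated entity embedding inside the score function, so that the plausibility of a new fact also reflects the entity's neighborhood.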

Highlights

  • Knowledge Graphs (KGs) are usually large in scale and very noisy

  • We extend our G2SKGE model with an attention mechanism (referred to as G2SKGEatt), which enables the model to focus on more important links when making predictions

  • The following conclusions can be made: 1) our proposed G2SKGE model outperforms all the baseline models on FB15k, which demonstrates the effectiveness of learning the sub-graph structure information of the entity in KG for knowledge graph embedding (KGE) models; 2) the improvement is much more significant compared with the graph neural network model R-GCN on both datasets, which is attributed to the fact that our model learns a fusion embedding of the entity by capturing both the incoming and outgoing links around it, while R-GCN only gathers the node information around the target entity; 3) G2SKGEatt achieves better results than G2SKGE, which further illustrates that the attention mechanism can be beneficial



Introduction

KGs are usually large in scale (containing millions of entities and billions of triplets) and very noisy. Knowledge graph embedding (KGE) has proven to be a highly scalable, efficient and effective method for knowledge graph completion (KGC) tasks (also known as link prediction tasks) compared with traditional symbolic models or logical inference systems [20], [21]. KGE focuses on learning appropriate numeric embedding vectors or matrices to represent the entities and relations in a KG. These numeric representations are called entity embeddings and relation embeddings (referred to as general embeddings) and are meant to preserve the meaningful information of the entities and relations in the KG. TransE [22] is a cornerstone model among embedding-based approaches; it projects entities and relations into a low-dimensional space. A large number of TransE-like approaches have since emerged, such as TransH [24], TransR [25], TransD [26], TransG [27], and TransPES [21].
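As a concrete illustration of the translational idea behind TransE, the sketch below scores a triplet with the commonly used distance f(h, r, t) = -||h + r - t||, so that triplets whose tail lies close to h + r receive higher scores. The choice of L2 norm, the embedding dimension, and the random toy vectors are our assumptions for illustration, not the exact configuration used in the paper.

```python
import numpy as np

def transe_score(h, r, t):
    """TransE-style score for a triplet (h, r, t): the smaller ||h + r - t||,
    the more plausible the fact, so we return the negative L2 distance."""
    return -np.linalg.norm(h + r - t, ord=2)

dim = 50
rng = np.random.default_rng(42)
h = rng.normal(scale=0.1, size=dim)  # head entity embedding
r = rng.normal(scale=0.1, size=dim)  # relation embedding

t_true = h + r + rng.normal(scale=0.01, size=dim)  # tail consistent with h + r
t_false = rng.normal(scale=0.1, size=dim)          # random, implausible tail

print(transe_score(h, r, t_true))   # close to 0 (high score)
print(transe_score(h, r, t_false))  # much more negative (low score)
```

In practice the embeddings are trained, typically with a margin-based loss over corrupted (negative) triplets, so that observed facts score higher than corrupted ones.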
