Abstract

A large number of embedding-based Knowledge Graph Completion (KGC) methods have been proposed to overcome the incompleteness of knowledge graphs (KGs). An important recent innovation in the Natural Language Processing (NLP) domain is the use of deep neural models that exploit pre-training, culminating in BERT, the most prominent example of this line of work today. Recently, a series of KGC methods that incorporate a pre-trained language model, such as KG-BERT, have been developed and have reported compelling performance. However, previous pre-training-based KGC methods typically train the model with a simple training task and use only one-hop relational signals in the KG, so they cannot model high-order semantic contexts or complex multi-hop relatedness. To overcome this problem, this paper presents a novel pre-training framework for the KGC task that consists of both a one-hop relation-level task (low-order) and a multi-hop meta-graph-level task (high-order). The proposed method can therefore capture not only fine-grained sub-graph structure but also subtle semantic information in the given KG. Empirical results show the effectiveness of the proposed method on widely used real-world datasets.
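
As a rough illustration of the two-level objective described above, the following is a minimal PyTorch sketch of jointly optimizing a one-hop triple-level task (low-order) and a multi-hop meta-graph-level task (high-order) over a shared Transformer encoder. The class, head names, and sequence layouts are illustrative assumptions, not the authors' released implementation.

```python
# Sketch only: a shared encoder with two task heads, one for one-hop triple
# plausibility (low-order) and one for multi-hop meta-graph instances
# (high-order). Architecture details here are assumptions for illustration.
import torch
import torch.nn as nn

class JointKGCPretrainer(nn.Module):
    def __init__(self, vocab_size, hidden=256, layers=4, heads=4, max_len=64):
        super().__init__()
        self.tok_emb = nn.Embedding(vocab_size, hidden)
        self.pos_emb = nn.Embedding(max_len, hidden)
        enc_layer = nn.TransformerEncoderLayer(hidden, heads, hidden * 4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, layers)
        # Low-order head: is this (h, r, t) triple plausible? (binary)
        self.triple_head = nn.Linear(hidden, 2)
        # High-order head: does this multi-hop path instantiate a valid
        # meta-graph pattern? (binary)
        self.metagraph_head = nn.Linear(hidden, 2)

    def encode(self, token_ids):
        pos = torch.arange(token_ids.size(1), device=token_ids.device)
        x = self.encoder(self.tok_emb(token_ids) + self.pos_emb(pos))
        return x[:, 0]  # first position acts as a [CLS]-style summary

    def forward(self, triple_ids, path_ids):
        return self.triple_head(self.encode(triple_ids)), \
               self.metagraph_head(self.encode(path_ids))

# Toy usage: random token ids stand in for serialized triples / meta-graph paths.
model = JointKGCPretrainer(vocab_size=1000)
triples = torch.randint(0, 1000, (8, 5))    # e.g. [CLS] h r t [SEP]
paths = torch.randint(0, 1000, (8, 11))     # e.g. [CLS] h r1 e1 r2 e2 ... [SEP]
triple_labels = torch.randint(0, 2, (8,))
path_labels = torch.randint(0, 2, (8,))

triple_logits, path_logits = model(triples, paths)
loss = nn.functional.cross_entropy(triple_logits, triple_labels) \
     + nn.functional.cross_entropy(path_logits, path_labels)  # joint low/high-order loss
loss.backward()
```

In this sketch both objectives share one encoder, so gradients from the meta-graph-level task can inform the representations used for one-hop triple scoring, which is the intuition behind combining low-order and high-order signals.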
