Abstract

Knowledge Graph Completion (KGC) aims to fill in missing facts in Knowledge Graphs (KGs). Because most real-world KGs evolve quickly, with new entities and relations added by the minute, the dynamic KGC task is more practical than the static one: it allows a KG to be scaled up easily by adding new entities and relations. Most existing dynamic KGC models ignore the dependency between multi-source information and topological structure, and therefore lose a great deal of semantic information in KGs. In this paper, we propose a novel dynamic KGC model with joint structural and textual dependency based on a deep recurrent neural network (DKGC-JSTD). The model learns embeddings of an entity's name and parts of its text description to connect unseen entities to the KG. To establish the relevance between textual description information and topological information, DKGC-JSTD uses a deep memory network with an association matching mechanism to extract semantic features relating entities and relations from entity text descriptions, and then employs a deep recurrent neural network to model the dependency between topological structure and text description. Experiments on large datasets, both old and new, show that DKGC-JSTD performs well on the dynamic KGC task.
