Abstract

The distributed representation of knowledge graphs (KGs) aims to embed a KG into a low-dimensional vector space, so as to facilitate KG completion as well as the application of KGs in other AI fields. Most existing models preserve certain proximity properties of KGs in the embedding space, such as the first/second-order proximity and the sequence-aware higher-order proximity. However, the ubiquitous similarity relationship between different sequences has rarely been discussed. In this paper, we propose a unified framework that preserves subgraph-aware proximity in the embedding space, holding that the sequences within a subgraph generally imply a similar pattern. Specifically, according to the composition and structure of KG sequences, we provide three methods for computing the embeddings of KG sequences: 1) simply summing the relations involved in the KG sequences of a relation subgraph; 2) a recurrent neural network for the KG sequences in a complete subgraph; 3) a dilated recurrent neural network that matches the special structure of the KG sequences in a complete subgraph. Empirically, we evaluate the proposed framework on the KG completion tasks of link prediction and entity classification. The results show that our framework outperforms the baselines by preserving the subgraph-aware proximity; in particular, exploiting the special structure of KG sequences further improves performance.
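The three sequence encoders named above can be illustrated with a minimal sketch. This is a hypothetical toy implementation (the weight matrices, dimensions, and function names are assumptions, not the paper's actual architecture): method 1 sums the relation embeddings of a sequence, method 2 runs a plain recurrence over them, and method 3 uses the same recurrence but carries the hidden state over from `stride` steps back, mimicking a dilated RNN.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8

# Hypothetical embeddings for a sequence of 4 relations in a subgraph.
seq = rng.normal(size=(4, dim))

def sum_embedding(relations):
    # Method 1 (relation subgraph): the sequence embedding is simply
    # the sum of the involved relation embeddings.
    return relations.sum(axis=0)

def rnn_embedding(relations, W_h, W_x, stride=1):
    # Method 2 (complete subgraph): a plain Elman-style recurrence when
    # stride == 1. Method 3 (dilated RNN): with stride > 1 the hidden
    # state is taken from `stride` steps back instead of the last step.
    states = []
    for t, x in enumerate(relations):
        prev = states[t - stride] if t - stride >= 0 else np.zeros(W_h.shape[0])
        states.append(np.tanh(W_h @ prev + W_x @ x))
    return states[-1]

# Toy recurrence weights (assumed, small scale for stability).
W_h = rng.normal(size=(dim, dim)) * 0.1
W_x = rng.normal(size=(dim, dim)) * 0.1

e_sum = sum_embedding(seq)                      # method 1
e_rnn = rnn_embedding(seq, W_h, W_x)            # method 2
e_dil = rnn_embedding(seq, W_h, W_x, stride=2)  # method 3, dilation of 2
```

All three encoders map a variable-length relation sequence to a single vector in the same embedding space, which is what allows the framework to compare sequences across a subgraph and enforce the subgraph-aware proximity.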
