Abstract

Graph embedding aims to map graphs into a low-dimensional embedding space while preserving their properties, so that the resulting embeddings can be readily exploited by downstream graph-based tasks. Most existing graph embedding models strive to preserve certain proximity properties of graphs, such as first/second-order proximity or higher-order proximity within a random walk sequence. However, the proximity relationship between different sequences, which is a significant property of graphs, has rarely been discussed. For this reason, we propose a graph dilated recurrent neural network (G-DRNN) model to learn this inter-sequence proximity. Since inter-sequence proximity exists within subgraphs, it can also be defined as subgraph-aware higher-order proximity. In particular, we note that a graph sequence consists of alternating nodes and edges and can therefore be decomposed into a node subsequence and an edge subsequence. To exploit this special structure, the G-DRNN adopts a skip-connection architecture that preserves the overall information of the graph sequence while maintaining the relative independence of each subsequence. To evaluate the effectiveness of the G-DRNN on multi-relational graphs, we apply it to the widely studied problem of knowledge graph embedding. The results demonstrate that the embeddings learned by the G-DRNN outperform those of previous approaches on both link prediction and node classification tasks.
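
To make the described architecture concrete, below is a minimal sketch, not the authors' released implementation, of how a dilated recurrent encoder over an alternating node/edge walk might look in PyTorch: the walk is split into a node subsequence and an edge subsequence, each processed by a GRU whose recurrent (skip) connection reaches back `dilation` steps, and the two subsequence summaries are fused into a single walk-level embedding. All names, dimensions, and hyperparameters here (`GDRNNSketch`, `DilatedGRULayer`, `dim`, `dilation`) are illustrative assumptions.

```python
# Hypothetical sketch of a dilated-RNN encoder for alternating node/edge walks.
# This is an assumption-based illustration, not the paper's official code.
import torch
import torch.nn as nn


class DilatedGRULayer(nn.Module):
    """GRU layer whose recurrent (skip) connection reaches back `dilation` steps."""

    def __init__(self, input_dim, hidden_dim, dilation):
        super().__init__()
        self.cell = nn.GRUCell(input_dim, hidden_dim)
        self.dilation = dilation
        self.hidden_dim = hidden_dim

    def forward(self, x):                       # x: (batch, seq_len, input_dim)
        batch, seq_len, _ = x.shape
        zero = x.new_zeros(batch, self.hidden_dim)
        states = []
        for t in range(seq_len):
            # Skip connection: the hidden state comes from step t - dilation, not t - 1.
            h_prev = states[t - self.dilation] if t >= self.dilation else zero
            states.append(self.cell(x[:, t], h_prev))
        return torch.stack(states, dim=1)        # (batch, seq_len, hidden_dim)


class GDRNNSketch(nn.Module):
    """Encode a walk of alternating nodes and edges with two dilated GRU tracks."""

    def __init__(self, n_nodes, n_edge_types, dim=64, dilation=2):
        super().__init__()
        self.node_emb = nn.Embedding(n_nodes, dim)
        self.edge_emb = nn.Embedding(n_edge_types, dim)
        # One track per subsequence keeps node and edge information relatively
        # independent, while the dilated recurrence still spans the whole walk.
        self.node_rnn = DilatedGRULayer(dim, dim, dilation)
        self.edge_rnn = DilatedGRULayer(dim, dim, dilation)
        self.out = nn.Linear(2 * dim, dim)

    def forward(self, node_ids, edge_ids):       # node_ids: (batch, k+1), edge_ids: (batch, k)
        h_nodes = self.node_rnn(self.node_emb(node_ids))
        h_edges = self.edge_rnn(self.edge_emb(edge_ids))
        # Summarize each subsequence and fuse into one walk-level embedding.
        summary = torch.cat([h_nodes[:, -1], h_edges[:, -1]], dim=-1)
        return self.out(summary)


if __name__ == "__main__":
    model = GDRNNSketch(n_nodes=1000, n_edge_types=20)
    nodes = torch.randint(0, 1000, (4, 6))       # walks of 6 nodes
    edges = torch.randint(0, 20, (4, 5))         # 5 connecting edges
    print(model(nodes, edges).shape)             # torch.Size([4, 64])
```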
