Abstract

Graph neural networks (GNNs) have recently become increasingly popular owing to their ability to learn node representations in complex graphs. Existing graph representation learning methods mainly target static graphs in Euclidean space, whereas many graphs in practical applications are dynamic and evolve continuously over time. Moreover, recent work has demonstrated that real-world graphs exhibit hierarchical properties, yet many methods do not account for these latent hierarchical structures. In this work, we propose a dynamic network in hyperbolic space via self-attention, referred to as DynHAT, which leverages both hyperbolic geometry and an attention mechanism to learn node representations. More specifically, DynHAT captures hierarchical information by mapping the structural graph into hyperbolic space, and captures time-varying dynamic evolution by flexibly weighting historical representations with self-attention. Through extensive experiments on three real-world datasets, we show the superiority of our model over competing methods on the link prediction task. In addition, our results show that embedding dynamic graphs in hyperbolic space remains competitive even when low embedding dimensions are required.
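The two ingredients the abstract names can be illustrated concretely. A minimal sketch follows, assuming the Poincaré ball model of hyperbolic space (the exponential map at the origin is the standard way to lift Euclidean features into the ball) and a simple dot-product attention over a node's historical representations; the function names `expmap0` and `temporal_attention` are illustrative and not taken from the paper's actual implementation.

```python
import numpy as np

def expmap0(v, c=1.0, eps=1e-9):
    """Exponential map at the origin of the Poincaré ball with curvature -c.

    exp_0(v) = tanh(sqrt(c) * ||v||) * v / (sqrt(c) * ||v||),
    which maps Euclidean feature vectors into the open unit ball (for c = 1).
    """
    sqrt_c = np.sqrt(c)
    norm = np.maximum(np.linalg.norm(v, axis=-1, keepdims=True), eps)
    return np.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

def temporal_attention(H):
    """Weight a node's historical representations H (shape (T, d)) by
    softmax-normalized dot-product scores against the latest snapshot."""
    q = H[-1]                                  # query: most recent representation
    scores = H @ q / np.sqrt(H.shape[-1])      # scaled dot-product scores, shape (T,)
    w = np.exp(scores - scores.max())          # numerically stable softmax
    w /= w.sum()
    return w @ H                               # attention-weighted summary, shape (d,)

# Lifted points lie strictly inside the unit ball (c = 1)
x = expmap0(np.random.randn(5, 8))
assert np.all(np.linalg.norm(x, axis=-1) < 1.0)
```

The exponential map is what lets Euclidean model outputs live in hyperbolic space, where distances grow exponentially toward the boundary and tree-like hierarchies embed with low distortion; the attention step is one simple way to let recent snapshots dominate or recede flexibly rather than averaging history uniformly.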
