Abstract

Representation learning over dynamic graphs has attracted considerable attention because of its wide range of applications. Recently, sequential probabilistic generative models have achieved impressive results because they can model the underlying data distribution. However, modeling the distribution of dynamic graphs remains extremely challenging. Existing methods usually ignore the mutual interference between stochastic and deterministic states. Moreover, the common assumption that latent variables follow Gaussian distributions is often inappropriate. To address these problems, we propose the stochastic graph recurrent neural network (SGRNN), a sequential generative model for representation learning over dynamic graphs. SGRNN separates stochastic and deterministic states in the recurrent iteration. To improve the flexibility of the latent variables, we model both the prior and the posterior as semi-implicit distributions, yielding DSI-SGRNN. In addition, to alleviate the KL-vanishing problem in SGRNN, we propose a simple and interpretable structure based on a lower bound of the KL divergence. The structure introduces only a few extra parameters and can be implemented with a few lines of code modification. Extensive experiments on real-world datasets demonstrate the effectiveness of the proposed model.
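To make the state-separation idea concrete, below is a minimal sketch (not the authors' implementation) of one SGRNN-style recurrent step. It assumes a plain GRU cell in place of the paper's graph recurrent unit and Gaussian latents for illustration; the names `prior_net` and `post_net` are hypothetical.

```python
import torch
import torch.nn as nn


class SGRNNCellSketch(nn.Module):
    """Illustrative recurrent cell with separated deterministic and stochastic states."""

    def __init__(self, x_dim, h_dim, z_dim):
        super().__init__()
        # Deterministic path: updated from the input only, so the sampled
        # latent never feeds back into (and interferes with) the recurrence.
        self.gru = nn.GRUCell(x_dim, h_dim)
        # Stochastic path: prior p(z_t | h_t) and posterior q(z_t | h_t, x_t),
        # each producing a mean and a log-variance.
        self.prior_net = nn.Linear(h_dim, 2 * z_dim)
        self.post_net = nn.Linear(h_dim + x_dim, 2 * z_dim)

    def forward(self, x_t, h_prev):
        h_t = self.gru(x_t, h_prev)  # deterministic update, no z involved
        prior_mu, prior_logvar = self.prior_net(h_t).chunk(2, dim=-1)
        post_mu, post_logvar = self.post_net(
            torch.cat([h_t, x_t], dim=-1)).chunk(2, dim=-1)
        # Reparameterized sample from the posterior.
        z_t = post_mu + torch.randn_like(post_mu) * (0.5 * post_logvar).exp()
        return h_t, z_t, (prior_mu, prior_logvar), (post_mu, post_logvar)


# Example usage with arbitrary dimensions:
cell = SGRNNCellSketch(x_dim=16, h_dim=32, z_dim=8)
h = torch.zeros(4, 32)                 # batch of 4, hidden size 32
h, z, prior, post = cell(torch.randn(4, 16), h)
```

The key design point is that `h_t` depends only on `x_t` and `h_prev`: in a VRNN-style cell the sampled latent would be fed back into the recurrence, letting stochastic noise perturb the deterministic dynamics, which is the mutual interference the abstract refers to.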
