Abstract

Unlike existing graph disentanglement neural networks, this paper interprets graph entanglement under a probabilistic generation framework. On this foundation, we propose a Mixed Probabilistic Generation Model induced Graph Disentanglement Network (MPGD). Treating the disentangled components corresponding to different latent factors as samples from factor-specific distributions, we theoretically derive a generalized probabilistic aggregation scheme over these components. As a key part of the mixed probabilistic generative model, we present a method for estimating the mixture probabilities with self-attention, together with an in-depth analysis of its close connection to classical EM parameter estimation. We further formulate a probabilistic aggregation that yields node representations in the embedding space. In addition, the prior mixture probabilities are formulated as an auxiliary factor-aware representation to facilitate twin-branch prediction. Extensive experiments show that MPGD achieves more competitive performance than existing state-of-the-art methods while exhibiting the desired disentangling effects. The code is available at https://github.com/GiorgioPeng/MPGD.
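The abstract does not specify the implementation details, so the following is only a minimal, hypothetical PyTorch sketch of the general mechanism it describes: factor-specific projections produce disentangled components, mixture probabilities over the factors are estimated with a self-attention-style softmax, and the node representation is the probability-weighted aggregation of the components. All class names, shapes, and design choices here are assumptions for illustration, not the authors' code (see the linked repository for the actual implementation).

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixtureDisentangleLayer(nn.Module):
    """Hypothetical sketch: K factor-specific components mixed by
    attention-estimated mixture probabilities (assumed design, not MPGD itself)."""

    def __init__(self, in_dim: int, out_dim: int, num_factors: int):
        super().__init__()
        # One projection per latent factor -> K disentangled components.
        self.factor_proj = nn.ModuleList(
            [nn.Linear(in_dim, out_dim) for _ in range(num_factors)])
        # Query used to score each component, self-attention style.
        self.query = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor):
        # x: (num_nodes, in_dim)
        comps = torch.stack([proj(x) for proj in self.factor_proj], dim=1)  # (N, K, D)
        q = self.query(x).unsqueeze(1)                                      # (N, 1, D)
        # Scaled dot-product scores -> mixture probabilities over K factors.
        scores = (comps * q).sum(dim=-1) / comps.size(-1) ** 0.5            # (N, K)
        probs = F.softmax(scores, dim=-1)                                   # (N, K)
        # Probability-weighted aggregation of components into one embedding.
        z = (probs.unsqueeze(-1) * comps).sum(dim=1)                        # (N, D)
        return z, probs

# Usage on random node features:
layer = MixtureDisentangleLayer(in_dim=16, out_dim=32, num_factors=4)
z, probs = layer(torch.randn(10, 16))
print(z.shape, probs.shape)  # torch.Size([10, 32]) torch.Size([10, 4])
```

Under this reading, the softmax-normalized attention scores play a role analogous to the posterior responsibilities computed in the E-step of EM, which is plausibly the connection to classical EM parameter estimation that the abstract alludes to.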
