Abstract

Graph Neural Networks (GNNs) have demonstrated state-of-the-art performance in a wide variety of analytical tasks. Current GNN approaches focus on learning representations in Euclidean space, which is effective at capturing non-tree-like structural relations but fails to model complex relations in many real-world graphs, such as tree-like hierarchical structure. This paper instead proposes to learn representations in both Euclidean and hyperbolic spaces to model these two types of graph geometry. To this end, we introduce a novel approach: Joint hyperbolic and Euclidean geometry contrastive graph neural networks (JointGMC). JointGMC learns multiple layer-wise optimal combinations of Euclidean and hyperbolic geometries to effectively encode diverse complex graph structures. Further, the performance of most GNNs relies heavily on the availability of large-scale manually labeled data. To mitigate this issue, JointGMC exploits proximity-based self-supervised information in different geometric spaces (i.e., Euclidean, hyperbolic, and Euclidean-hyperbolic interaction spaces) to regularize (semi-)supervised graph learning. Extensive experimental results on eight real-world graph datasets show that JointGMC outperforms eight state-of-the-art GNN models across diverse graph mining tasks, including node classification, link prediction, and node clustering, demonstrating JointGMC's superior graph representation ability. Code is available at https://github.com/chachatang/jointgmc.
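To make the core geometric idea concrete, here is a minimal toy sketch (not the paper's implementation) contrasting Euclidean distance with the Poincaré-ball hyperbolic distance, and a hypothetical `alpha`-weighted mixture standing in for a learned layer-wise combination of the two geometries:

```python
import numpy as np

def poincare_distance(u, v, eps=1e-9):
    """Hyperbolic distance between two points inside the unit Poincare ball.

    d(u, v) = arccosh(1 + 2 * ||u - v||^2 / ((1 - ||u||^2) * (1 - ||v||^2)))
    Points closer to the ball's boundary are exponentially far apart,
    which is what makes hyperbolic space suit tree-like hierarchies.
    """
    sq_dist = np.sum((u - v) ** 2)
    denom = (1.0 - np.sum(u ** 2)) * (1.0 - np.sum(v ** 2))
    return np.arccosh(1.0 + 2.0 * sq_dist / (denom + eps))

def mixed_distance(u, v, alpha=0.5):
    """Hypothetical fixed-weight blend of Euclidean and hyperbolic proximity.

    In JointGMC the combination is learned per layer; `alpha` here is just
    an illustrative stand-in for that learned weighting.
    """
    d_euc = np.linalg.norm(u - v)
    d_hyp = poincare_distance(u, v)
    return alpha * d_euc + (1.0 - alpha) * d_hyp

# Two toy node embeddings inside the unit ball.
u = np.array([0.1, 0.2])
v = np.array([0.3, -0.1])
print(np.linalg.norm(u - v))   # Euclidean distance
print(poincare_distance(u, v)) # hyperbolic distance (larger here)
print(mixed_distance(u, v))    # blended proximity
```

A proximity-based contrastive objective, as the abstract describes, would then pull connected nodes together and push unconnected nodes apart under each of these distances, in each geometric space.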
