Abstract

Contrastive self-supervised representation learning on attributed graph networks with graph neural networks has attracted considerable research interest recently. However, two challenges remain. First, most real-world systems are multi-relational: entities are linked by different types of relations, and each relation forms a view of the graph network. Second, the rich multi-scale information (structure-level and feature-level) of the graph network can serve as self-supervised signals, yet it has not been fully exploited. A novel contrastive self-supervised representation learning framework on attributed multiplex graph networks with multi-scale information (named CoLM2S) is presented in this study. It mainly contains two components: intra-relation contrastive learning and inter-relation contrastive learning. Specifically, the contrastive self-supervised representation learning framework on attributed single-layer graph networks with multi-scale information (CoLMS) is introduced first; it uses a graph convolutional network as the encoder to capture intra-relation information through multi-scale structure-level and feature-level self-supervised signals. The structure-level information includes the edge structure and the sub-graph structure, and the feature-level information corresponds to the outputs of different graph convolutional layers. Second, following the consensus assumption among relations, the CoLM2S framework is proposed to jointly learn the various graph relations in an attributed multiplex graph network and obtain a global consensus node embedding. The proposed method can fully distil the graph information. Extensive experiments on unsupervised node clustering and graph visualisation tasks demonstrate the effectiveness of our methods, which outperform existing competitive baselines.
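To make the high-level description concrete, the following is a minimal PyTorch sketch of the general pattern the abstract describes: a shared GCN encoder applied to each relation view, with a contrastive objective aligning the same node across views. The two-layer encoder, the InfoNCE-style loss, the temperature value, and all names (GCNEncoder, info_nce) are illustrative assumptions, not the paper's exact CoLMS/CoLM2S formulation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class GCNEncoder(nn.Module):
    """Two-layer GCN encoder: H(l+1) = act(A_hat @ H(l) @ W(l))."""
    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.w1 = nn.Linear(in_dim, hid_dim, bias=False)
        self.w2 = nn.Linear(hid_dim, out_dim, bias=False)

    def forward(self, a_hat, x):
        h1 = F.relu(a_hat @ self.w1(x))   # first-layer output (one feature-level scale)
        h2 = a_hat @ self.w2(h1)          # second-layer output (another feature-level scale)
        return h1, h2

def info_nce(z1, z2, tau=0.5):
    """Symmetric InfoNCE loss; the positive pair is the same node in two views."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau
    labels = torch.arange(z1.size(0))
    return 0.5 * (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels))

# Toy usage: two relation views of the same node set (random data for illustration).
n, d = 8, 16
x = torch.randn(n, d)
a1, a2 = torch.eye(n), torch.eye(n)       # stand-ins for normalised adjacency per relation
enc = GCNEncoder(d, 32, 32)
_, z1 = enc(a1, x)
_, z2 = enc(a2, x)
loss = info_nce(z1, z2)
loss.backward()

In the actual framework, the intra-relation objective would also contrast structure-level signals (edge and sub-graph structure) within each relation, and the inter-relation objective would pull the per-relation embeddings toward a consensus embedding; the sketch above only illustrates the cross-view contrastive step.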
