Abstract

Graph neural networks (GNNs) have achieved considerable success in dealing with graph-structured data via the message-passing mechanism. This mechanism, however, relies on a fundamental assumption that the graph structure along which information propagates is reliable. Real-world graphs are inevitably incomplete or noisy, which violates this assumption and limits performance. Optimizing the graph structure for GNNs is therefore indispensable. Although current semi-supervised graph structure learning (GSL) methods have achieved promising performance, the potential of labels and the prior graph structure has not yet been fully exploited. Motivated by this, we examine GSL with dual reinforcement from labels and the prior structure in this article. Specifically, to enhance label utilization, we first propose to construct prior label-constrained matrices that refine the graph structure by identifying label consistency. Second, to adequately leverage the prior structure to guide GSL, we develop spectral contrastive learning, which extracts global properties embedded in the prior graph structure. Moreover, contrastive fusion with the prior spatial structure is further adopted, encouraging the learned structure to integrate local spatial information from the prior graph. To evaluate our proposal extensively, we conduct experiments on seven benchmark datasets, where the results confirm the effectiveness of our method and the rationality of the learned structure from various aspects.
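To make the label-constrained refinement idea concrete, the sketch below shows one plausible reading of it in PyTorch: training labels induce "same-label" and "different-label" masks, which are then used to strengthen or suppress entries of a learned similarity matrix. All function names, weights, and the exact refinement rule here are illustrative assumptions, not the paper's actual formulation.

```python
import torch

def label_constraint_matrices(labels: torch.Tensor, train_mask: torch.Tensor):
    """Build masks over pairs of labeled training nodes:
    must_link[i, j] = 1 if both are labeled with the same class,
    cannot_link[i, j] = 1 if both are labeled with different classes."""
    same = labels.unsqueeze(0) == labels.unsqueeze(1)            # (n, n) same-label pairs
    labeled = train_mask.unsqueeze(0) & train_mask.unsqueeze(1)  # both endpoints labeled
    must_link = (same & labeled).float()
    cannot_link = (~same & labeled).float()
    return must_link, cannot_link

def refine_structure(sim: torch.Tensor, must_link: torch.Tensor,
                     cannot_link: torch.Tensor,
                     boost: float = 1.0, penalty: float = 1.0):
    """Refine a learned similarity/adjacency matrix: reinforce label-consistent
    pairs and suppress label-inconsistent ones (hypothetical rule)."""
    refined = sim + boost * must_link - penalty * cannot_link
    return torch.clamp(refined, min=0.0)  # keep edge weights non-negative

# Tiny usage example: 4 nodes, the first 2 carry training labels.
labels = torch.tensor([0, 1, 0, 1])
train_mask = torch.tensor([True, True, False, False])
sim = torch.rand(4, 4)
must_link, cannot_link = label_constraint_matrices(labels, train_mask)
refined = refine_structure(sim, must_link, cannot_link)
```

In this reading, unlabeled pairs are left untouched, so the label constraints only act where label consistency can actually be verified; how the paper combines this refined structure with spectral and spatial contrastive terms is not reflected here.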
