Abstract

Pre-training a graph neural network model aims to learn the general characteristics of large-scale graphs, or of graphs of a similar type, usually through self-supervised methods, which allows the model to work even when node labels are missing. However, existing pre-training methods do not take into account the temporal information of edge generation or the evolution process of the graph. To address this issue, this paper proposes a pre-training method based on dynamic graph neural networks (PT-DGNN), which uses a dynamic graph generation task to learn the structure, semantics, and evolution features of the graph simultaneously. The method comprises two steps: 1) dynamic subgraph sampling, and 2) pre-training with two graph generation tasks. The former preserves the local time-aware structure of the original graph by sampling the most recently and most frequently interacting nodes. The latter uses observed edges to predict unobserved edges, capturing the evolutionary characteristics of the network. Comparative experiments on three realistic dynamic network datasets show that the proposed pre-training method achieves the best results on the link prediction fine-tuning task, and an ablation study further verifies the effectiveness of the two steps above.
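The two steps above can be sketched in simplified form. The function names, the recency/frequency scoring mix, and the `alpha` weight below are illustrative assumptions, not the paper's actual implementation; the sketch only conveys the idea of sampling a time-aware neighborhood and splitting edges by timestamp into observed inputs and unobserved prediction targets.

```python
# Hypothetical sketch of PT-DGNN's two pre-training steps.
# Each edge is a tuple (u, v, t): nodes u and v interact at timestamp t.

def dynamic_subgraph_sample(edges, seed_node, budget, alpha=0.5):
    """Step 1 (illustrative): pick up to `budget` neighbors of `seed_node`,
    favoring recent interactions (large t) and frequent interaction partners.
    `alpha` weights recency vs. frequency and is an assumed parameter."""
    latest, freq = {}, {}
    for u, v, t in edges:
        for a, b in ((u, v), (v, u)):
            if a == seed_node:
                latest[b] = max(latest.get(b, t), t)   # most recent timestamp
                freq[b] = freq.get(b, 0) + 1           # interaction count
    if not latest:
        return []
    t_max = max(latest.values())
    f_max = max(freq.values())
    # Normalized score mixing recency and frequency.
    scores = {n: alpha * latest[n] / t_max + (1 - alpha) * freq[n] / f_max
              for n in latest}
    return sorted(scores, key=scores.get, reverse=True)[:budget]

def temporal_edge_split(edges, ratio=0.8):
    """Step 2 (illustrative): order edges by timestamp; the earliest `ratio`
    fraction is 'observed' (model input), the remainder are 'unobserved'
    targets for the graph generation / link prediction task."""
    ordered = sorted(edges, key=lambda e: e[2])
    cut = int(len(ordered) * ratio)
    return ordered[:cut], ordered[cut:]
```

A usage sketch: sample a time-aware subgraph around each seed node, then train the model to reconstruct the held-out later edges from the earlier ones, which is what lets the pre-trained model capture how the network evolves.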
