Abstract

Temporal link prediction based on graph neural networks has become a research focus in the field of complex networks. Existing graph-neural-network-based temporal link prediction methods do not consider future time-domain features and make only limited use of spatial-domain features. To address these problems, this paper proposes a novel temporal link prediction method based on a two-stream adaptive graph neural network. First, network topology features are extracted from the micro, meso, and macro perspectives; combined with an adaptive mechanism of convolution and self-attention, this makes the feature-extraction preprocessing more effective. Second, an extended bidirectional long short-term memory (Bi-LSTM) network is proposed, which uses graph convolution to process topological features and recursively learns the state vectors of the target snapshot from both future time-domain information and past historical information. Third, the positional encoding of the transformer mechanism is replaced by a time encoding, so that past and future information can be learned from each other and the time-domain information of the network can be mined further. Finally, a novel two-stream framework is proposed that combines the processing results of node features and edge features. Experimental results on nine datasets show that the proposed method achieves better prediction accuracy and robustness than classical graph neural network methods.
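
To make the pipeline concrete, the sketch below (in PyTorch) illustrates three of the ingredients the abstract names: graph convolution applied per snapshot, a bidirectional recurrence over the snapshot sequence so each state sees both past and future context, and a learned time encoding standing in for positional encoding. All class names, layer sizes, and the inner-product link scorer are hypothetical simplifications for illustration only; the paper's actual extended Bi-LSTM, adaptive convolution/self-attention mechanism, and two-stream node/edge fusion are more elaborate.

    # Minimal, illustrative sketch; NOT the paper's implementation.
    import torch
    import torch.nn as nn

    class SimpleGraphConv(nn.Module):
        """One-hop graph convolution: H' = ReLU(A_norm @ H @ W)."""
        def __init__(self, in_dim: int, out_dim: int):
            super().__init__()
            self.linear = nn.Linear(in_dim, out_dim)

        def forward(self, adj_norm: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
            # adj_norm: (N, N) normalized adjacency with self-loops
            # x: (N, in_dim) node features for one snapshot
            return torch.relu(adj_norm @ self.linear(x))

    class TemporalLinkScorer(nn.Module):
        """GCN per snapshot -> BiLSTM over time -> pairwise link scores."""
        def __init__(self, feat_dim: int, hid_dim: int, num_snapshots: int):
            super().__init__()
            self.gcn = SimpleGraphConv(feat_dim, hid_dim)
            # Bidirectional so each snapshot state combines past history
            # with future time-domain context, as the abstract describes.
            self.bilstm = nn.LSTM(hid_dim, hid_dim, batch_first=True,
                                  bidirectional=True)
            # Learned per-snapshot time encoding, standing in for the
            # transformer's positional encoding.
            self.time_enc = nn.Embedding(num_snapshots, hid_dim)
            self.proj = nn.Linear(2 * hid_dim, hid_dim)

        def forward(self, adjs: torch.Tensor, feats: torch.Tensor) -> torch.Tensor:
            # adjs: (T, N, N) normalized adjacency per snapshot
            # feats: (T, N, F) node features per snapshot
            T, N, _ = feats.shape
            # Per-snapshot graph convolution plus the time encoding.
            h = torch.stack([self.gcn(adjs[t], feats[t])
                             + self.time_enc(torch.tensor(t))
                             for t in range(T)])      # (T, N, hid)
            out, _ = self.bilstm(h.transpose(0, 1))   # (N, T, 2*hid)
            z = self.proj(out[:, -1])                 # final state per node
            # Symmetric inner-product scores for every node pair.
            return torch.sigmoid(z @ z.t())           # (N, N)

    # Example usage on random data:
    T, N, F = 5, 20, 8
    adjs = torch.stack([torch.eye(N) for _ in range(T)])  # toy adjacencies
    feats = torch.randn(T, N, F)
    model = TemporalLinkScorer(F, 16, T)
    probs = model(adjs, feats)  # (N, N) link probabilities, next snapshot

The symmetric inner product is used here only as the simplest possible pairwise scorer; the method in the paper instead fuses a node-feature stream and an edge-feature stream before producing the final link predictions.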
