Abstract

Link prediction with continuous-time dynamic graph neural networks is a challenging task. Previous studies have modelled the historical interaction sequences of individual nodes, but this does not sufficiently capture the links between a pair of nodes. The frequency of historical common neighbours was therefore proposed to provide more information about node pairs; however, this measure ignores the timing of interactions, even though the relative importance of different interactions shifts over time. We address this challenge with MTdyg, a multi-scale transformer with a continuous-time dynamic graph model, designed to enhance link prediction. MTdyg employs a multi-scale patching strategy that adjusts patch size according to interaction frequency and feeds the segmented interaction sequences into a transformer, significantly improving the model's ability to exploit historical data. Furthermore, we develop a temporal attention-based historical common-neighbour encoding that effectively identifies links between source and target nodes in interaction sequences. Extensive experiments on eight public datasets confirm that MTdyg delivers superior performance on most datasets, demonstrating its effectiveness in capturing how node interactions evolve over time.
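The frequency-adaptive patching idea can be illustrated with a minimal sketch. The function below is a hypothetical reading of the abstract, not the paper's implementation: all names, the frequency rule, and the patch-size bounds are illustrative assumptions. It splits a node's timestamped interaction history into patches whose size grows with the node's interaction frequency, so dense histories are summarized at a coarser granularity before being fed, patch by patch, into a transformer.

```python
# Hypothetical sketch of a multi-scale patching strategy: patch size
# adapts to interaction frequency (events per unit time). The rule
# "coarser patches for busier nodes" and the bounds are assumptions
# made for illustration only.

def multiscale_patches(timestamps, base_patch=2, max_patch=8):
    """Split a node's interaction timestamps into patches.

    Higher-frequency nodes get larger patches, yielding fewer,
    coarser tokens for the downstream transformer.
    """
    if not timestamps:
        return []
    span = max(timestamps) - min(timestamps)
    freq = len(timestamps) / span if span > 0 else float("inf")
    # Scale patch size with frequency, clamped to [base_patch, max_patch].
    patch_size = min(max_patch, max(base_patch, int(base_patch * freq)))
    return [timestamps[i:i + patch_size]
            for i in range(0, len(timestamps), patch_size)]

# Eight evenly spaced interactions over 3.5 time units.
events = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5]
patches = multiscale_patches(events)
```

In a full model, each patch would be pooled into a single token embedding before entering the transformer; the patching step only controls the granularity at which history is summarized.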