Abstract

As dynamic graphs have become indispensable in many fields owing to their ability to represent relationships that evolve over time, the development of Temporal Graph Neural Networks (TGNNs) has grown accordingly. When training TGNNs for dynamic graph link prediction, the commonly used negative sampling method often produces starkly contrasting samples, which can lead the model to overfit these pronounced differences and compromise its ability to generalize to new data. To address this challenge, we introduce a novel negative sampling approach named Enhanced Negative Sampling (ENS). This strategy accounts for two pervasive traits of dynamic graphs: (1) historical dependence, meaning that nodes frequently re-establish connections they held in the past, and (2) temporal proximity preference, meaning that nodes are more inclined to connect with those they have recently interacted with. Specifically, our technique employs a designed scheduling function to control the difficulty of the negative samples throughout training, ensuring that training proceeds in a balanced manner and becomes incrementally more challenging, thereby enhancing TGNNs' proficiency in predicting links within dynamic graphs. In an empirical evaluation across multiple datasets, we find that ENS, integrated as a modular component, notably improves the performance of four state-of-the-art (SOTA) baselines. We further investigate the applicability of ENS to dynamic graphs with varied attributes. Our code is available at https://github.com/qqaazxddrr/ENS.
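To make the idea concrete, the abstract's scheduling mechanism might be sketched as follows. This is a hypothetical illustration, not the authors' implementation: the names `difficulty_schedule` and `sample_negative`, the linear schedule, and the choice of hard-negative pools (a node's historical and recent neighbors, reflecting the two traits above) are all assumptions for exposition.

```python
import random

def difficulty_schedule(progress):
    """Hypothetical linear schedule: the fraction of hard negatives
    grows from 0 to 1 as training progress goes from 0 to 1."""
    return min(max(progress, 0.0), 1.0)

def sample_negative(all_nodes, historical, recent, progress, rng=random):
    """Sketch of curriculum-style negative sampling for link prediction.

    With probability given by the schedule, draw a 'hard' negative from
    the source node's historical or recent neighbors (harder to tell
    apart from true links); otherwise draw a uniformly random node."""
    p_hard = difficulty_schedule(progress)
    hard_pool = list(historical | recent)
    if hard_pool and rng.random() < p_hard:
        return rng.choice(hard_pool)        # hard negative
    return rng.choice(list(all_nodes))      # easy (random) negative
```

Early in training (`progress` near 0) the sampler behaves like the standard uniform negative sampler; late in training it draws mostly hard negatives, so the model is not allowed to rely solely on the stark contrasts that uniform sampling produces.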
