ABSTRACT

Recently, significant progress has been made in completing static knowledge graphs. However, knowledge tends to evolve over time, and static knowledge graph completion (KGC) methods struggle to capture these changes. Temporal knowledge graph (TKG) reasoning has therefore become a focus of research. Most existing TKG methods incorporate temporal information into triplets and transform the task into KGC, ignoring the important influence of time information and the implicit relationships between entities. In this paper, we propose a new method, TD-RKG, which addresses the challenges of temporal variability and implicit entity correlations through a dynamic fusion representation learning approach. The method consists of four modules: a dynamic local recurrent encoding layer, a dynamic implicit encoding layer, a dynamic global information attention layer, and a decoding layer. Experimental results on three benchmark datasets demonstrate that TD-RKG achieves substantial improvements across multiple evaluation metrics.