Abstract

Temporal knowledge graphs (TKGs) are widely used across many fields, and inferring missing facts in knowledge graphs has been extensively studied. However, reasoning about potential future facts on TKGs is more challenging and has attracted growing attention from researchers. Because the unknowability of future events complicates inference, a thorough study of the characteristics of historical facts becomes crucial: analyzing the concurrency of historical events and the common patterns underlying relationships facilitates reasoning about future facts. In this paper, we propose a novel representation learning model based on Short-Term Sequential Patterns for TKG reasoning, named STSP, which models TKG sequences recurrently to learn representations of entities and relations. Specifically, the STSP encoder comprises three main modules: a convolution-based relation-aware GCN models the concurrent facts at each timestamp; an entity-aware attention module integrates the entity representations of the current and previous timestamps; and a sliding-window mechanism learns the different relations sequentially. The entity and relation representations are then passed to a translation-based decoder for final reasoning. We evaluate the proposed approach on four benchmark datasets. The experimental results show that STSP outperforms state-of-the-art TKG reasoning methods and achieves substantial performance gains.
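The encoder–decoder pipeline described above can be sketched in miniature. The code below is an illustrative toy only, not the paper's implementation: the dimensions, the simple additive message passing, the sigmoid gate standing in for entity-aware attention, and the TransE-style distance decoder are all assumptions chosen to show the overall data flow (per-timestamp relation-aware aggregation, blending of current and previous entity states, a short-term sliding window over snapshots, then translation-based scoring).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions (not taken from the paper).
n_entities, n_relations, dim = 5, 3, 8

ent = rng.normal(size=(n_entities, dim))   # initial entity embeddings
rel = rng.normal(size=(n_relations, dim))  # relation embeddings

def rgcn_layer(state, facts):
    """One relation-aware message-passing step over one timestamp's
    concurrent facts (s, r, o): each object aggregates subject + relation."""
    out = state.copy()
    for s, r, o in facts:
        out[o] += state[s] + rel[r]
    # row-normalize to keep magnitudes stable across timestamps
    return out / np.linalg.norm(out, axis=1, keepdims=True)

def entity_attention(prev, curr):
    """Stand-in for entity-aware attention: a per-entity sigmoid gate
    that blends the previous- and current-timestamp entity states."""
    score = np.sum(prev * curr, axis=1, keepdims=True)
    alpha = 1.0 / (1.0 + np.exp(-score))
    return alpha * curr + (1.0 - alpha) * prev

def transe_score(h, r, t):
    """Translation-based decoder: smaller ||h + r - t|| = more plausible."""
    return -np.linalg.norm(h + r - t)

# Toy TKG: one fact list per timestamp; a sliding window keeps only
# the most recent snapshots (short-term sequential patterns).
snapshots = [
    [(0, 0, 1), (2, 1, 3)],   # t = 0
    [(1, 0, 2), (3, 2, 4)],   # t = 1
    [(0, 1, 4)],              # t = 2
]
window = 2

state = ent
for facts in snapshots[-window:]:  # encode only the short-term window
    state = entity_attention(state, rgcn_layer(state, facts))

# Future query (subject 0, relation 0, ?): score every candidate object.
scores = [transe_score(state[0], rel[0], state[o]) for o in range(n_entities)]
print(int(np.argmax(scores)))
```

In a real model the aggregation weights, attention, and decoder parameters would be learned end-to-end; here they are fixed so the flow of representations through the three encoder modules and the decoder is easy to follow.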
