Abstract

As the basis of downstream applications such as automated question answering and knowledge reasoning, Temporal Knowledge Graph (TKG) embedding learning has attracted much interest. Recent works have explored several methods for learning dynamic embeddings for TKGs and have succeeded in many applications; however, these methods are difficult to optimize. To accurately capture the propagation of semantic information between entities and between entities and relations, and to track the dynamics of entities and relations in continuous time, we propose a novel TKG embedding model, ODETKGE. Instead of using recurrent neural networks to model the dynamics of semantic representations in continuous time, we leverage neural ordinary differential equations to model this dynamic evolution, which enables us to capture long-range dependencies between embeddings at different timestamps, track the dynamics of TKGs with irregular intervals, and remain computationally efficient. Furthermore, to effectively model the semantics of the knowledge graph structure, we combine entity-relation collaboratively updated graph convolutional networks with neural ordinary differential equations. Additionally, to capture important semantics such as the creation and disappearance of relations, we employ a graph transition layer. Compared with RNN-based models, ODETKGE uses fewer parameters to achieve deeper layers and makes the transformation of representations between two timestamps smoother. Extensive experiments on real-world datasets verify that ODETKGE outperforms state-of-the-art models on link forecasting, long-interval link forecasting, and irregular-interval link forecasting, as well as in ablation studies and training time.
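The core idea of driving embedding evolution with a neural ODE rather than an RNN can be sketched as follows. This is a minimal illustration, not the authors' implementation: the dynamics function here is an untrained random linear-plus-tanh layer standing in for the learned network, the dimensions are made up, and a fixed-step Euler solver replaces whatever solver the paper uses. The key property it demonstrates is that the same solver integrates embeddings over intervals of any length, which is how irregular timestamps are handled.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: 5 entities, embedding dimension 4.
num_entities, dim = 5, 4
Z0 = rng.normal(size=(num_entities, dim))  # entity embeddings at t = 0

# Stand-in for a learned dynamics network: dZ/dt = f(Z).
W = rng.normal(scale=0.1, size=(dim, dim))

def dynamics(Z):
    """Time derivative of the entity embeddings (illustrative only)."""
    return np.tanh(Z @ W)

def evolve(Z, t0, t1, steps=100):
    """Integrate dZ/dt = dynamics(Z) from t0 to t1 with fixed-step Euler."""
    h = (t1 - t0) / steps
    for _ in range(steps):
        Z = Z + h * dynamics(Z)
    return Z

# Irregular intervals: the same call handles a short gap and a long one,
# producing a smooth continuous-time trajectory between observations.
Z_t1 = evolve(Z0, 0.0, 0.7)    # short interval
Z_t2 = evolve(Z_t1, 0.7, 3.0)  # longer, irregular interval
print(Z_t2.shape)
```

In the full model, the output of a graph convolution over the TKG snapshot would serve as the state fed into such a solver, and the dynamics function would be trained end to end; the sketch only isolates the continuous-time evolution step.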
