Abstract
Knowledge graph embedding (KGE) is a pivotal technique in artificial intelligence that maps the intricate structure of a knowledge graph (KG) into a continuous vector space, supporting logical reasoning and efficient management in downstream KG tasks. Conventional KGE techniques focus primarily on representing static facts within a KG. In the real world, however, facts frequently change over time, as exemplified by evolving social relationships and news events. How to effectively use embedding techniques to represent KGs that incorporate temporal data has therefore attracted significant scholarly interest. This paper comprehensively reviews existing methods for learning KG representations that incorporate temporal information. It offers an intuitive perspective by categorizing temporal KGE (TKGE) methods into seven main classes based on dynamic evolution models and extensions of static KGE. The review covers the background, problem definition, symbolic notation, training process, commonly used datasets, evaluation schemes, and related research on TKGE. Detailed descriptions of the related embedding models are provided, followed by an introduction to typical downstream tasks in temporal KG scenarios. Finally, the paper summarizes the challenges faced in TKGE and outlines future research directions.
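As a concrete illustration of the idea summarized above (not taken from the paper itself), one widely cited way to extend a static KGE model to the temporal setting is to learn an additional embedding for each timestamp and include it in the scoring function, in the style of TTransE; the notation below (h, r, t for head, relation, and tail, and τ for the timestamp) is illustrative.

% Hedged sketch, assuming a TTransE-style temporal extension of TransE.
% Static TransE scores a triple (h, r, t) by translating the head by the relation:
%   f(h, r, t) = -|| \mathbf{h} + \mathbf{r} - \mathbf{t} ||
% A temporal variant adds a learned timestamp embedding \boldsymbol{\tau} to the translation,
% so a quadruple (h, r, t, τ) is scored as:
\[
  f(h, r, t, \tau) \;=\; -\,\bigl\lVert \mathbf{h} + \mathbf{r} + \boldsymbol{\tau} - \mathbf{t} \bigr\rVert
\]
% Higher scores (smaller translation error) indicate that the fact is more plausible
% at time τ; facts whose validity changes over time receive different scores for
% different timestamp embeddings.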