Abstract

Knowledge graph embedding, which aims to learn distributed representations of entities and relations, has proven to be an effective method for predicting missing links in knowledge graphs. Existing knowledge graph embedding models generally assume that the spaces in which head and tail entities reside have the same properties. However, head and tail entities can be different types of objects and should not be constrained to vector spaces with identical properties. In this paper, we propose a novel knowledge graph embedding model called TimE, which represents each entry of the head (or tail) entity embedding as a point in a time-domain space, and the corresponding entry of the tail (or head) entity embedding as a point in a frequency-domain space. Specifically, TimE defines each relation as a composite operation consisting of a translation between entities and a diagonal projection matrix that projects entities into the time-domain space. In addition, we propose a cross-operation to model inverse and symmetric relations. Experimental results show that TimE not only outperforms existing state-of-the-art models on several large-scale benchmark datasets for the link prediction task, but also better captures the diversity of entity embedding distributions across different relation patterns, and can model all relation patterns (including symmetry/antisymmetry, inversion, and composition).
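The abstract describes each relation as a translation combined with a diagonal projection matrix. A minimal sketch of what such a scoring function could look like is given below; since the abstract does not give the exact scoring form, the choice of L1 distance, the parameter names (`w_r`, `b_r`), and the use of element-wise scaling to represent the diagonal matrix are all assumptions for illustration, not the paper's published model.

```python
import numpy as np

def time_score(h, t, w_r, b_r):
    """Score a triple (h, r, t) in a TimE-style model (hypothetical form).

    h, t : head and tail entity embeddings (1-D arrays)
    w_r  : diagonal of the relation's projection matrix,
           applied as an element-wise scale (assumed)
    b_r  : relation translation vector

    The head entity is projected by the diagonal matrix, translated,
    and compared to the tail via negative L1 distance; higher scores
    indicate more plausible triples.
    """
    return -np.sum(np.abs(w_r * h + b_r - t))

# Toy usage with random embeddings
rng = np.random.default_rng(0)
d = 8
h, t = rng.normal(size=d), rng.normal(size=d)
w_r, b_r = rng.normal(size=d), rng.normal(size=d)
print(time_score(h, t, w_r, b_r))
```

Under this form, a triple scores its maximum (zero) exactly when the projected and translated head coincides with the tail, which is the usual plausibility criterion in translation-based embedding models.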
