Abstract

Representation learning of knowledge graphs is the task of representing entities and relations of real-world knowledge as low-dimensional vectors. Previous works have demonstrated that entity types are beneficial for knowledge representation, but these methods either rely heavily on prior knowledge or lack pluggability. To address these limitations, we propose a new method named TRPE that better exploits entity type information for knowledge graph embedding. We suggest that entities should have multiple representations and that each entity type should play a different role in different relations. To better characterize the interaction between entity types and relations, we build type-relation pairs for each entity and obtain each type-relation pair feature through the interaction between type-specific features and relation-specific features. These type-relation pair features are assigned different weights and aggregated to serve as the transformation of entity embeddings. After that, we divide each entity embedding into several sub-embeddings, each of which undergoes its own transformation. Moreover, our model is a pluggable module that can be attached to other models. We integrate it into three baseline models and evaluate on the FB15k and FB15k+ datasets. The experimental results show that our model brings significant improvement to the baseline models. On the CoDEx datasets, which contain noisier entity type data, our model still achieves good results compared with other models, demonstrating its generalization ability.
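The mechanism described above (type-relation pair features, weighted aggregation, and per-sub-embedding transformation) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the elementwise interaction, the attention scoring, and all function and variable names are assumptions made for clarity.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def type_aware_transform(entity_emb, type_embs, relation_emb, n_sub=4):
    """Hypothetical sketch of a TRPE-style transformation.

    entity_emb:   (d,)        embedding of the entity
    type_embs:    (n_types, d) type-specific feature per entity type
    relation_emb: (d,)        relation-specific feature
    """
    # 1. Type-relation pair features: interaction between each type-specific
    #    feature and the relation-specific feature (elementwise product here,
    #    chosen only as a placeholder interaction).
    pair_feats = type_embs * relation_emb            # (n_types, d)

    # 2. Assign each pair feature a weight (scored against the relation).
    weights = softmax(pair_feats @ relation_emb)     # (n_types,)

    # 3. Aggregate the weighted pair features into one transformation vector.
    transform = weights @ pair_feats                 # (d,)

    # 4. Divide the entity embedding into sub-embeddings; each sub-embedding
    #    is transformed by its own slice of the aggregated vector.
    subs = np.split(entity_emb, n_sub)
    t_subs = np.split(transform, n_sub)
    return np.concatenate([s * t for s, t in zip(subs, t_subs)])

rng = np.random.default_rng(0)
d, n_types = 8, 3
out = type_aware_transform(rng.normal(size=d),
                           rng.normal(size=(n_types, d)),
                           rng.normal(size=d))
```

The output has the same dimensionality as the input entity embedding, so the module can be slotted in front of any score function of a baseline model, which is what makes the approach pluggable.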
