Abstract

This paper proposes a simple knowledge graph embedding (KGE) framework that considers entity type information without requiring additional resources. KGE obtains vector representations of entities and relations by learning the structured information in triples, and the resulting vectors are used to predict missing links in a knowledge graph (KG). Although many KGs contain entity type information, most existing methods have ignored its potential for the link prediction task. The proposed framework, called entity and entity type composition representation learning (EETCRL), obtains vector representations of both entities and entity types, which are combined and used for link prediction. Experimental results on three datasets show that EETCRL outperforms the baseline methods in most cases. Furthermore, results from tests with different model sizes show that the proposed framework achieves high performance even with a small model size. This paper also analyzes the experimental results to discuss how considering entity type information affects the link prediction task.
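The abstract does not specify EETCRL's scoring function, but the core idea of composing entity and entity-type embeddings for link prediction can be sketched as follows. This is a minimal illustration, not the authors' implementation: all sizes, the type assignment, the additive composition, and the TransE-style distance score are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 16

# Hypothetical toy vocabulary sizes; in practice these come from the KG.
num_entities, num_types, num_relations = 5, 3, 2

# Separate embedding tables for entities, entity types, and relations.
entity_emb = rng.normal(scale=0.1, size=(num_entities, dim))
type_emb = rng.normal(scale=0.1, size=(num_types, dim))
relation_emb = rng.normal(scale=0.1, size=(num_relations, dim))

# Each entity is mapped to one type id (assumed known from the KG).
entity_type = np.array([0, 0, 1, 2, 2])

def compose(entity_id):
    # Combine the entity vector with its type vector;
    # simple addition is one possible composition.
    return entity_emb[entity_id] + type_emb[entity_type[entity_id]]

def score(head, relation, tail):
    # TransE-style score on composed representations: higher is better.
    return -np.linalg.norm(compose(head) + relation_emb[relation] - compose(tail))

# Link prediction: rank all candidate tails for the query (head=0, relation=1, ?).
scores = np.array([score(0, 1, t) for t in range(num_entities)])
ranking = np.argsort(-scores)  # best-scoring candidate tail first
```

In such a setup the type embeddings are shared across all entities of the same type, which is one way a small model could still capture useful regularities.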
