Abstract

Knowledge representation learning (KRL) aims to embed the entities and relations of knowledge graphs (KGs) into vector representations. In recent years, several multi-task learning (MTL) frameworks have been proposed for KRL and have achieved promising results. Nevertheless, most of these models share only the initial embeddings of entities and relations among sub-tasks. To share richer information across tasks, we construct hard negative samples for the relation prediction (RP) and triple classification (TC) tasks by using the similarity matrix produced by the entity prediction (EP) task, and we share the hidden-layer features of the EP task with the TC task. The proposed framework is applicable to CNN-based KRL models. Experimental results on the WN18RR and FB15k-237 datasets show that the proposed framework clearly improves the performance of the EP, RP, and TC tasks compared with single-task models, and that it obtains better results on RP and TC than state-of-the-art MTL-based models and frameworks. Code of NS-KRL is available online at https://github.com/fofilix/NS_KRL.
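To illustrate the hard-negative construction described above, the following is a minimal sketch, not the authors' implementation: given a positive triple, the tail entity is corrupted with one of the entities most similar to it according to an entity-similarity matrix (here assumed to come from the EP task). All names and parameters (`sample_hard_negatives`, `sim_matrix`, `k`) are hypothetical.

```python
import numpy as np

def sample_hard_negatives(triples, sim_matrix, k=5, rng=None):
    """Corrupt the tail of each (head, relation, tail) triple with one of the
    k entities most similar to the true tail, excluding the tail itself.
    `sim_matrix` is an (n_entities, n_entities) float array of similarities,
    assumed here to be derived from the entity prediction (EP) task."""
    rng = rng or np.random.default_rng()
    negatives = []
    for h, r, t in triples:
        sims = sim_matrix[t].astype(float).copy()
        sims[t] = -np.inf                        # never pick the true tail
        top_k = np.argpartition(-sims, k)[:k]    # k most similar candidates
        t_neg = int(rng.choice(top_k))
        negatives.append((h, r, t_neg))
    return negatives

# Toy usage: 6 entities, one positive triple (head=0, relation=2, tail=3).
sim = np.random.default_rng(0).random((6, 6))
print(sample_hard_negatives([(0, 2, 3)], sim, k=2))
```

Because the corrupted tail is chosen among the entities most similar to the true tail rather than uniformly at random, the resulting negatives are harder to distinguish from positives, which is the intuition behind using them for the RP and TC tasks.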
