Abstract

As artificial intelligence gradually enters the cognitive intelligence stage, knowledge graphs (KGs) play an increasingly important role in many natural language processing tasks. Because long-tail relations are prevalent in KGs, few-shot knowledge graph completion (KGC), i.e., link prediction for long-tail relations, has become a hot research topic. Current few-shot KGC methods mainly focus on static representations of surrounding entities to explore their potential semantic features, while ignoring the dynamic properties among entities and the particular influence of long-tail relations on link prediction. In this paper, a new meta-learning-based dynamic adaptive relation learning model (DARL) is proposed for few-shot KGC. To obtain richer semantic information for the meta knowledge, DARL applies a dynamic neighbor encoder that incorporates neighbor relations into entity embeddings. In addition, DARL builds an attention-based fusion strategy over different attributes of the same relation to further enhance relation-meta learning. We evaluate DARL on two public benchmark datasets, NELL-One and WIKI-One, for link prediction. Extensive experimental results indicate that DARL outperforms state-of-the-art models, with average relative improvements of about 23.37% and 32.46% in MRR and Hits@1 on NELL-One, respectively.
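
The abstract describes the dynamic neighbor encoder only at a high level, so the following is a minimal, illustrative sketch (in PyTorch) of how an attention-based neighbor encoder of this general kind can fold neighbor relations into an entity embedding. The class name DynamicNeighborEncoder, the tensor shapes, the scoring network, and the residual combination are assumptions made for illustration; they are not taken from the paper and do not reproduce DARL's actual equations.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DynamicNeighborEncoder(nn.Module):
    # Illustrative sketch, NOT the paper's architecture: each entity attends over
    # its (relation, neighbor-entity) pairs, so neighbor contributions are weighted
    # dynamically instead of being averaged statically.
    def __init__(self, dim: int):
        super().__init__()
        self.proj = nn.Linear(2 * dim, dim)  # fuse a neighbor relation with its entity
        self.attn = nn.Linear(2 * dim, 1)    # score each fused neighbor against the focal entity

    def forward(self, entity, rel_neighbors, ent_neighbors, mask):
        # entity:        (B, d)    embedding of the focal entity
        # rel_neighbors: (B, N, d) embeddings of the relations to each neighbor
        # ent_neighbors: (B, N, d) embeddings of the neighbor entities
        # mask:          (B, N)    1 for real neighbors, 0 for padding
        pair = torch.tanh(self.proj(torch.cat([rel_neighbors, ent_neighbors], dim=-1)))  # (B, N, d)
        query = entity.unsqueeze(1).expand_as(pair)                                      # (B, N, d)
        scores = self.attn(torch.cat([query, pair], dim=-1)).squeeze(-1)                 # (B, N)
        scores = scores.masked_fill(mask == 0, float("-inf"))
        weights = F.softmax(scores, dim=-1)                                              # (B, N)
        context = torch.bmm(weights.unsqueeze(1), pair).squeeze(1)                       # (B, d)
        return entity + context  # neighbor-aware entity embedding

if __name__ == "__main__":
    B, N, d = 2, 5, 16
    enc = DynamicNeighborEncoder(d)
    entity = torch.randn(B, d)
    rels = torch.randn(B, N, d)
    ents = torch.randn(B, N, d)
    mask = torch.ones(B, N)
    mask[0, 3:] = 0  # first entity has only 3 real neighbors
    print(enc(entity, rels, ents, mask).shape)  # torch.Size([2, 16])

In a few-shot KGC pipeline, an encoder of this kind would typically be applied to both head and tail entities of the support triples before the relation meta is derived, so that each entity representation emphasizes the neighbors most relevant to the target relation.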
