Abstract

Embedding knowledge graphs with graph attention networks has become an active research topic in knowledge graph completion. However, current graph attention networks can produce identical embeddings for entities with different neighborhood structures, and embedding quality directly determines the effectiveness of completion. We analyze why graph attention networks fail to distinguish such structures: attention-based GNN aggregation discards cardinality information, which reflects the multiplicity of diverse neighbor features and helps distinguish the contributions of different nodes in a neighborhood. We therefore propose the cardinality-preserving graph attention model (KBCPA). Cardinality information is added into the attention-based aggregation so that different entities receive different representations, improving the discriminative ability of the model. Our experiments show that the model is effective and competitive, obtaining better performance than previous state-of-the-art embedding models for knowledge graph completion on the two benchmark datasets WN18RR and FB15k-237.
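
For intuition, the following is a minimal NumPy sketch of the failure mode described above: because softmax attention weights sum to one, two neighborhoods with the same feature distribution but different sizes (cardinalities) aggregate to the same vector. The cardinality-aware variant shown here, which rescales the attention output by the neighborhood size, is an illustrative assumption and not necessarily the exact KBCPA aggregation.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def attention_aggregate(neighbor_feats, scores):
    """Standard attention aggregation: weights sum to 1, so neighborhoods with
    the same feature distribution but different cardinality collapse to the
    same embedding."""
    alpha = softmax(scores)
    return alpha @ neighbor_feats  # weighted mean of neighbor features

def cardinality_preserving_aggregate(neighbor_feats, scores):
    """Illustrative cardinality-aware variant (hypothetical, not the exact
    KBCPA formulation): rescale the attended sum by the neighborhood size so
    that neighborhoods of different cardinality remain distinguishable."""
    alpha = softmax(scores)
    n = neighbor_feats.shape[0]  # cardinality of the neighborhood
    return n * (alpha @ neighbor_feats)

# Two neighborhoods with the same feature distribution but different sizes.
small = np.array([[1.0, 0.0], [0.0, 1.0]])
large = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 0.0], [0.0, 1.0]])

print(attention_aggregate(small, np.zeros(2)))                # [0.5 0.5]
print(attention_aggregate(large, np.zeros(4)))                # [0.5 0.5] -- indistinguishable
print(cardinality_preserving_aggregate(small, np.zeros(2)))   # [1. 1.]
print(cardinality_preserving_aggregate(large, np.zeros(4)))   # [2. 2.] -- now distinct
```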
