Abstract

Knowledge graphs (KGs), which contain large numbers of triples, have recently attracted considerable research attention, and many knowledge graph embedding (KGE) methods have been proposed for the knowledge graph completion task. In real applications, knowledge graphs are used not only in a centralized way but also in a decentralized manner. We study the problem of learning knowledge graph embeddings for a set of federated knowledge graphs whose raw triples are not allowed to be collected together. We propose a federated learning framework, FedEC. In this framework, a local training procedure learns knowledge graph embeddings on each client based on a specific embedding learner, and embedding-contrastive learning is applied to limit the embedding updates and thus tackle data heterogeneity. A global update procedure then shares and averages entity embeddings on the master server. Furthermore, we design embedding ensemble procedures to take full advantage of the knowledge learned from different aspects. Finally, we conduct extensive experiments on datasets derived from KGE benchmark datasets, and the results show the effectiveness of our proposed model.
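To make the setup concrete, the sketch below illustrates (under stated assumptions, not as the authors' reference implementation) how a client might combine a KGE loss on its local triples with a contrastive regularizer that keeps local entity embeddings close to the latest global embeddings, and how a server might average entity embeddings across clients. All names here (`model.score_loss`, `model.entity_emb`, `mu`, etc.) are hypothetical placeholders for whichever embedding learner and hyperparameters are actually used.

```python
# Illustrative sketch only: hypothetical interfaces, not the FedEC reference code.
import torch
import torch.nn.functional as F


def embedding_contrastive_loss(local_emb, global_emb, temperature=0.1):
    """Contrastive regularizer: pull each local entity embedding toward its
    corresponding global embedding (positive pair) and away from the global
    embeddings of other entities (negatives), limiting local drift under
    data heterogeneity."""
    local_n = F.normalize(local_emb, dim=-1)
    global_n = F.normalize(global_emb, dim=-1)
    logits = local_n @ global_n.t() / temperature               # similarity to all global embeddings
    targets = torch.arange(local_emb.size(0), device=local_emb.device)  # positive = same entity index
    return F.cross_entropy(logits, targets)


def client_update(model, triples, global_entity_emb, mu=0.5, lr=1e-3, epochs=1):
    """One local training round: KGE loss on the client's own triples plus the
    embedding-contrastive term weighted by a hypothetical coefficient mu.
    `model.score_loss` stands in for any triple-scoring loss (e.g. a margin loss)."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        kge_loss = model.score_loss(triples)
        con_loss = embedding_contrastive_loss(model.entity_emb.weight, global_entity_emb)
        loss = kge_loss + mu * con_loss
        opt.zero_grad()
        loss.backward()
        opt.step()
    # Only entity embeddings are shared with the server; raw triples stay local.
    return model.entity_emb.weight.detach()


def server_average(client_entity_embs, client_weights):
    """Global update: weighted average of the entity embeddings uploaded by clients,
    assuming the embeddings are aligned to a common entity index."""
    total = sum(client_weights)
    return sum(w * e for w, e in zip(client_weights, client_entity_embs)) / total
```

In this reading, privacy comes from exchanging only entity embeddings rather than triples, while the contrastive term plays the role described in the abstract of constraining how far each client's embeddings move away from the shared global state.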
