Abstract
Graph data are ubiquitous in the real world. Graph learning (GL) attempts to mine and analyze graph data so that valuable information can be discovered. Existing GL methods are designed for centralized scenarios. However, in practical scenarios, graph data are usually distributed across different organizations, resulting in isolated data silos. To address this problem, we incorporate federated learning into GL and propose a general Federated Graph Learning framework called FedGL. FedGL obtains a high-quality global graph model while protecting data privacy by discovering global self-supervision information during federated training. Specifically, we propose uploading prediction results and node embeddings to the server to discover global pseudo labels and a global pseudo graph. These are then distributed to each client to enrich the training labels and complement the graph structure, respectively, thereby improving the quality of each local model. Moreover, the global self-supervision enables each client's information to flow and be shared in a privacy-preserving manner, thus alleviating the heterogeneity of graph data across clients and exploiting their complementarity. Finally, experimental results show that FedGL significantly outperforms baselines on four widely used graph datasets.
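The server-side step described above can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm: the function name `server_round`, the confidence and similarity thresholds, and the averaging/thresholding choices are all assumptions for demonstration. The sketch averages clients' class probabilities into global pseudo labels for confident nodes, and links pairs of nodes with highly similar averaged embeddings into a global pseudo graph.

```python
import numpy as np

def server_round(client_probs, client_embeds, conf_thresh=0.9, sim_thresh=0.99):
    """Hypothetical sketch of one FedGL server step.

    client_probs:  (num_clients, num_nodes, num_classes) softmax outputs
    client_embeds: (num_clients, num_nodes, dim) node embeddings
    Returns pseudo labels (-1 = unlabeled) and a pseudo adjacency matrix.
    """
    # Global pseudo labels: average class probabilities across clients
    # and keep only nodes whose top probability is confident enough.
    avg = np.mean(client_probs, axis=0)            # (num_nodes, num_classes)
    conf = avg.max(axis=1)
    pseudo_labels = np.where(conf >= conf_thresh, avg.argmax(axis=1), -1)

    # Global pseudo graph: average embeddings across clients and connect
    # node pairs whose cosine similarity exceeds a threshold.
    emb = np.mean(client_embeds, axis=0)           # (num_nodes, dim)
    normed = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    sim = normed @ normed.T
    np.fill_diagonal(sim, 0.0)                     # no self-loops
    pseudo_adj = (sim >= sim_thresh).astype(int)
    return pseudo_labels, pseudo_adj
```

The server would then send `pseudo_labels` back to each client to augment its labeled set, and `pseudo_adj` to add edges missing from the client's local subgraph.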