Abstract

Federated Graph Learning (FGL) opens up new possibilities for machine learning on complex networks and for distributed training, enabling multiple clients to collaboratively train Graph Neural Networks (GNNs) while preserving privacy. However, participating clients may differ in data distribution, computational power, and model architecture, so the resulting heterogeneity across clients must be mitigated. Most existing model-heterogeneous Federated Learning frameworks are designed for non-graph data and are therefore less effective in FGL scenarios. Moreover, we find that prototype-based model-heterogeneous Federated Learning approaches generate a single global prototype, which provides poor guidance to individual clients when many heterogeneous clients participate. In addition, class imbalance in clients' local data degrades overall utility in FGL scenarios. To this end, we propose FGPL, a model-heterogeneous FGL framework based on prototype learning and local data augmentation. Specifically, on the local side, we mitigate class imbalance by generating nodes from global prototypes and local structures, which in turn yields better local prototypes. On the global side, we address domain bias by generating targeted global prototypes and use contrastive learning to provide more personalized and accurate guidance for each client's training. Extensive experiments demonstrate that our approach substantially improves prediction performance in FGL. The code associated with this study is available at https://github.com/1447884149/FGPL.
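The abstract's core building blocks, class prototypes and contrastive guidance toward global prototypes, follow a standard formulation in prototype-based federated learning. The sketch below is illustrative only (the function names, the cosine-similarity/InfoNCE form, and the temperature parameter are assumptions, not FGPL's published design): local prototypes are class-wise mean embeddings, and a contrastive loss pulls a node embedding toward the global prototype of its class and away from other classes' prototypes.

```python
import numpy as np

def class_prototypes(embeddings, labels, num_classes):
    """Local prototypes as class-wise mean node embeddings (a standard
    formulation; FGPL's exact construction may differ)."""
    protos = {}
    for c in range(num_classes):
        mask = labels == c
        if mask.any():
            protos[c] = embeddings[mask].mean(axis=0)
    return protos

def contrastive_guidance_loss(z, y, global_protos, tau=0.5):
    """InfoNCE-style loss (assumed form): cosine similarity between a node
    embedding z and each class's global prototype, softmaxed over classes;
    the loss encourages z to align with the prototype of its own class y."""
    classes = sorted(global_protos)
    P = np.stack([global_protos[c] for c in classes])
    sims = P @ z / (np.linalg.norm(P, axis=1) * np.linalg.norm(z) + 1e-8) / tau
    logits = sims - sims.max()          # numerical stability
    probs = np.exp(logits) / np.exp(logits).sum()
    return -np.log(probs[classes.index(y)] + 1e-12)
```

In this sketch, a well-aligned embedding incurs a small loss while a misclassified-looking one incurs a large loss, which is the "guidance" role the abstract attributes to the global prototypes.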

