Abstract

Federated learning (FL) has emerged as a pivotal paradigm for training machine learning models across decentralized devices while maintaining data privacy. In the healthcare domain, FL enables collaborative training among diverse medical devices and institutions, enhancing model robustness and generalizability without compromising patient privacy. In this paper, we propose DPS-GAT, a novel approach integrating graph attention networks (GATs) with differentially private client selection and resource allocation strategies in FL. Our methodology addresses the challenges of data heterogeneity and limited communication resources inherent in medical applications. By employing graph neural networks (GNNs), we effectively capture the relational structures among clients, optimizing the selection process and ensuring efficient resource distribution. Differential privacy mechanisms are incorporated to safeguard sensitive information throughout the training process. Our extensive experiments, based on the Regensburg pediatric appendicitis open dataset, demonstrate the superiority of our approach in terms of model accuracy, privacy preservation, and resource efficiency compared to traditional FL methods. The ability of DPS-GAT to maintain a high and stable number of selected clients across rounds and differential privacy budgets has significant practical implications, indicating that FL systems can achieve strong privacy guarantees without compromising client engagement or model performance. This balance is essential for real-world applications where both privacy and performance are paramount. This study suggests a promising direction for more secure and efficient FL medical applications, which could improve patient care through enhanced predictive models and collaborative data utilization.
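To make the idea concrete, the following is a minimal sketch, not taken from the paper, of how attention scores computed over a client graph might be combined with a Gaussian differential-privacy mechanism before top-k client selection. The graph construction, feature choices, attention formulation, and all function and parameter names (`gat_scores`, `dp_select`, `epsilon`, `delta`, etc.) are illustrative assumptions rather than the authors' actual DPS-GAT design.

```python
# Illustrative sketch only: a single graph-attention pass scores clients on a
# client-similarity graph, Gaussian noise (a standard DP mechanism) is added
# to the scores, and the top-k clients are selected. All dimensions and
# hyperparameters below are assumptions, not the paper's configuration.
import numpy as np

rng = np.random.default_rng(0)

def gat_scores(H, A, W, a):
    """One graph-attention pass returning a per-client importance score.

    H : (n, d)  client feature matrix (e.g. data size, local loss, bandwidth)
    A : (n, n)  binary adjacency of the client graph (with self-loops)
    W : (d, h)  shared linear projection
    a : (2h,)   attention vector
    """
    Z = H @ W                                   # project client features
    n = Z.shape[0]
    e = np.full((n, n), -np.inf)                # unnormalized attention logits
    for i in range(n):
        for j in range(n):
            if A[i, j]:
                e[i, j] = np.tanh(a @ np.concatenate([Z[i], Z[j]]))
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha = np.where(np.isfinite(e), alpha, 0.0)
    alpha /= alpha.sum(axis=1, keepdims=True) + 1e-12
    Hp = alpha @ Z                              # attention-weighted neighborhood mix
    return Hp.mean(axis=1)                      # scalar score per client

def dp_select(scores, k, epsilon, delta, sensitivity=1.0):
    """Gaussian-mechanism noise on the scores, then top-k selection."""
    sigma = sensitivity * np.sqrt(2 * np.log(1.25 / delta)) / epsilon
    noisy = scores + rng.normal(0.0, sigma, size=scores.shape)
    return np.argsort(noisy)[-k:]

# Toy example: 8 clients, 4 features each, a random client graph.
n, d, h = 8, 4, 3
H = rng.normal(size=(n, d))
A = (rng.random((n, n)) < 0.4).astype(int)
np.fill_diagonal(A, 1)
W = rng.normal(size=(d, h))
a = rng.normal(size=(2 * h,))

selected = dp_select(gat_scores(H, A, W, a), k=3, epsilon=1.0, delta=1e-5)
print("selected clients:", selected)
```

A lower privacy budget (smaller `epsilon`) increases the noise scale, so the selected set varies more across rounds; the abstract's claim of stable selection under varying budgets corresponds to keeping this trade-off favorable.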
