Abstract

Federated Learning enables multiple mobile devices to collaboratively train an artificial intelligence model while preserving data privacy. Instead of collecting raw training data from mobile devices to the cloud, Federated Learning coordinates a group of devices to train a shared model in a distributed manner, with the training data remaining on the devices. However, unbalanced data distribution and heterogeneous hardware configurations across devices badly hurt the performance of the collaborative model and severely impede overall training progress. Thus, a framework that balances model accuracy against training progress is urgently required. In this paper, we propose PAGroup, a privacy-aware grouping framework for high-performance Federated Learning. PAGroup intelligently divides the participating clients into groups by carefully analyzing the privacy requirements of the training data and the solid social relationships among the participating clients. PAGroup then conducts data shaping and capability-aware training to improve model performance while accelerating the overall training process. We evaluate PAGroup with both simulation and a hardware testbed. The evaluation results show that PAGroup improves model accuracy by up to 21%. Meanwhile, compared with the baselines, it reduces communication overhead by up to 81%, computation cost by up to 29%, and wall-clock time by up to 84%.
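To make the distributed training pattern described above concrete, here is a minimal sketch of federated averaging on a toy scalar model. This is an illustration of the general Federated Learning workflow the abstract refers to, not PAGroup itself; the function names, the toy model, and the learning rate are all hypothetical choices for this sketch.

```python
def local_train(weights, data, lr=0.1):
    # Hypothetical local update: gradient steps on the device's own
    # data for a toy scalar model w minimizing (w*x - y)^2.
    w = weights
    for x, y in data:
        grad = 2 * (w * x - y) * x
        w -= lr * grad
    return w

def federated_round(global_w, client_datasets):
    # Each client trains locally on its private data; only the updated
    # weights (never the raw data) are sent back and aggregated.
    updates = [local_train(global_w, d) for d in client_datasets]
    sizes = [len(d) for d in client_datasets]
    # Weighted average: clients with more samples count proportionally more.
    return sum(w * n for w, n in zip(updates, sizes)) / sum(sizes)

# Toy setup: five clients, each holding samples of the line y = 3x.
clients = [[(x, 3 * x) for x in range(1, 4)] for _ in range(5)]
w = 0.0
for _ in range(20):
    w = federated_round(w, clients)
print(round(w, 2))  # converges toward 3.0
```

The raw `(x, y)` pairs never leave the client lists; only the scalar weight crosses the "network" each round, which is the privacy property Federated Learning is built around.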

Full Text: Published version (Free)