Abstract

Federated Learning (FL) is an emerging machine learning technique for training models on large-scale data in resource-constrained settings. However, three main challenges arise regarding communication resources in FL. First, the parameter server (PS) that collects the user devices' models resides in a remote cloud, so model aggregation can burden the path links between the PS and high-traffic local nodes. Second, the network can become congested owing to the large size of the model parameters. Third, PS-side links may be highly stressed when the number of participating clients is large. In the present study, we propose a resource-efficient FL scheme that clusters clients based on their locations and communication ranges and, within each cluster, selects a subset of clients to update the model by exploiting the Pareto principle. Simulation results show that the proposed scheme reduces wireless network traffic while achieving slightly higher accuracy than the legacy FL mechanism.
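The abstract does not give implementation details, but a minimal sketch of the location-based clustering and Pareto-based client selection it describes might look as follows. The choice of k-means over client coordinates, the per-client utility score, and all function names here are assumptions for illustration, not the authors' actual method.

```python
# Hedged sketch: cluster clients by 2-D location (naive k-means is an
# assumption; the paper may cluster differently), then per cluster select
# the top ~20% of clients by a utility score (Pareto / 80-20 principle).
import numpy as np

def cluster_clients(locations, n_clusters, n_iters=50, seed=0):
    """Naive k-means over 2-D client locations; returns cluster labels."""
    rng = np.random.default_rng(seed)
    centers = locations[rng.choice(len(locations), n_clusters, replace=False)]
    for _ in range(n_iters):
        # Assign each client to its nearest cluster center.
        dists = np.linalg.norm(locations[:, None] - centers[None], axis=2)
        labels = dists.argmin(axis=1)
        # Move each center to the mean of its assigned clients.
        for k in range(n_clusters):
            if (labels == k).any():
                centers[k] = locations[labels == k].mean(axis=0)
    return labels

def select_pareto_clients(labels, scores, fraction=0.2):
    """Per cluster, keep the top `fraction` of clients by score."""
    selected = []
    for k in np.unique(labels):
        members = np.flatnonzero(labels == k)
        n_keep = max(1, int(np.ceil(fraction * len(members))))
        top = members[np.argsort(scores[members])[::-1][:n_keep]]
        selected.extend(top.tolist())
    return selected

# Example round: 100 clients; the score could reflect local data size or
# link quality (hypothetical choice, not specified in the abstract).
locations = np.random.default_rng(1).uniform(0, 1000, size=(100, 2))
scores = np.random.default_rng(2).random(100)
labels = cluster_clients(locations, n_clusters=5)
participants = select_pareto_clients(labels, scores, fraction=0.2)
print(f"{len(participants)} of 100 clients selected for this round")
```

Because only the selected subset uploads model updates each round, the number of uplink transmissions toward the PS drops roughly in proportion to the selection fraction, which is the traffic-reduction effect the abstract claims.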
