Abstract

Federated learning (FL) is a promising technique for collaboratively training machine-learning models on massively distributed client data under privacy constraints. However, the existing FL literature focuses on speeding up the learning process and largely overlooks the communication cost, which is critical for resource-constrained clients. To this end, in this article, we propose a novel 3-way hierarchical framework (THF) to promote communication efficiency in FL. Under the proposed framework, only a cluster head (CH) communicates toward the cloud server via edge aggregation, which minimizes the communication cost incurred by clients. In particular, clients upload their local models to their respective CHs, which are responsible for forwarding them to the corresponding edge server. The edge server averages the local models and iterates until the target edge accuracy is reached. Afterward, each edge server uploads its edge model to the cloud server for global aggregation. In this way, model uploading and downloading require less bandwidth because of the short source-to-destination distance, yielding an efficient 3-way hierarchical network structure. In addition, we formulate a joint communication and computation resource management scheme based on efficient client selection to achieve global cost minimization in FL. We conduct extensive empirical evaluations on diverse learning tasks over multiple data sets and show that THF achieves global cost savings and converges within fewer communication rounds than other FL approaches.
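To make the aggregation flow concrete, the following is a minimal sketch of one global round under the client → cluster head → edge server → cloud hierarchy described above. It assumes simple FedAvg-style weighted averaging, a placeholder `local_train` update, and a fixed `edge_iters` loop standing in for the edge-accuracy stopping criterion; these names and choices are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def average(models, weights):
    """Weighted average of model parameter vectors (FedAvg-style)."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    return sum(w * m for w, m in zip(weights, models))

def hierarchical_round(edge_groups, local_train, edge_iters=5):
    """One global round of a 3-way hierarchical aggregation scheme.

    edge_groups: list of edge groups; each group is a list of clusters,
                 each cluster a list of (client_model, num_samples) pairs.
    local_train: callable refining a model on a client's local data
                 (placeholder for the actual local update).
    edge_iters:  proxy for "iterate until edge accuracy is reached".
    """
    edge_models, edge_sizes = [], []
    for edge_group in edge_groups:
        edge_model = None
        for _ in range(edge_iters):
            cluster_models, cluster_sizes = [], []
            for cluster in edge_group:
                # Each client trains locally; the cluster head (CH) forwards
                # the cluster's models toward the edge server.
                models = [local_train(m if edge_model is None else edge_model.copy(), n)
                          for m, n in cluster]
                sizes = [n for _, n in cluster]
                cluster_models.append(average(models, sizes))
                cluster_sizes.append(sum(sizes))
            # Edge aggregation over the models forwarded by the CHs.
            edge_model = average(cluster_models, cluster_sizes)
        edge_models.append(edge_model)
        edge_sizes.append(sum(sum(n for _, n in c) for c in edge_group))
    # Cloud server performs global aggregation over the edge models.
    return average(edge_models, edge_sizes)
```

In this sketch, only the edge model travels to the cloud at the end of the round, while client and CH traffic stays within the cluster and edge tiers, which is the source of the bandwidth savings the abstract describes.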
