Abstract

This research investigates the consequences of the high computational complexity of on-device operations executed during the federated learning (FL) process. We examine how varying computational capabilities and battery levels among mobile devices introduce performance disparities and affect training quality. To address these challenges, we propose EAFL+, a novel energy optimization technique that manages power consumption on devices with limited battery capacity. EAFL+ is a cloud–edge–terminal collaborative approach that provides a new architectural design for power-aware FL training by leveraging resource diversity and computation offloading. The scheme enables the efficient selection of an approximately optimal offloading target from a set of Cloud-tier, Edge-tier, and Terminal-tier resources, achieving the best cost-quality tradeoff for the devices participating in the FL system. Our evaluation shows that EAFL+ conserves the energy of participating devices, which improves participation rates and increases client contributions, thereby achieving higher accuracy and faster convergence. Through experiments on real datasets and traces in an emulated FL environment, EAFL+ reduces client drop-out to zero and improves accuracy by up to 24% and 9% compared to EAFL and Oort, respectively.
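As a rough illustration of the power-aware offloading idea summarized above, the sketch below picks a training target among Terminal-, Edge-, and Cloud-tier candidates by minimizing a weighted energy/latency/quality cost. The field names, weights, and cost function are our own assumptions for exposition; they are not the actual EAFL+ selection policy.

```python
# Illustrative sketch only: the exact cost model and selection policy of EAFL+
# are not given in the abstract, so the tier names, energy/latency/quality
# fields, and weights below are hypothetical placeholders.
from dataclasses import dataclass

@dataclass
class OffloadTarget:
    name: str           # e.g. "terminal" (on-device), "edge", or "cloud"
    energy_cost: float  # estimated battery drain on the client for this choice (J)
    latency: float      # estimated per-round training latency (s)
    quality: float      # expected contribution to model quality (0..1)

def select_target(targets, battery_level, alpha=0.5, beta=0.3, gamma=0.2):
    """Return the target with the lowest weighted cost.

    Clients with low battery weight energy more heavily, mimicking the
    power-aware selection described in the abstract (weights are assumptions).
    """
    energy_weight = alpha * (1.0 + (1.0 - battery_level))  # penalize energy more when battery is low
    def cost(t):
        return energy_weight * t.energy_cost + beta * t.latency - gamma * t.quality
    return min(targets, key=cost)

if __name__ == "__main__":
    tiers = [
        OffloadTarget("terminal", energy_cost=8.0, latency=30.0, quality=1.00),
        OffloadTarget("edge",     energy_cost=2.5, latency=12.0, quality=0.90),
        OffloadTarget("cloud",    energy_cost=1.0, latency=25.0, quality=0.85),
    ]
    chosen = select_target(tiers, battery_level=0.15)
    print(f"Low-battery client offloads training to: {chosen.name}")
```

Under this toy cost model, a client with a nearly depleted battery tends to offload to the Edge or Cloud tier, while a well-charged client may keep training on-device; the paper's full method would replace these heuristics with its own cost-quality optimization.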
