Abstract
Contemporary applications leverage machine learning models to optimize performance, often requiring data to be transmitted to a remote server for training. However, this approach entails significant resource consumption and raises privacy concerns. Federated Learning addresses these concerns through a cyclical process of on-device training (local model update) followed by reporting to the server for aggregation (global model update). In each iteration of this cycle, termed a communication round, a client selection component determines which devices participate in improving the global model. However, the existing literature inadequately addresses scenarios where optimized energy consumption is imperative. This paper introduces an Energy Saving Client Selection (ESCS) mechanism that considers decision criteria such as battery level, training time capacity, and network quality. As a pertinent use case, classification scenarios are used to compare the performance of ESCS against other state-of-the-art approaches. The findings reveal that ESCS effectively conserves energy while maintaining competitive performance. This research contributes to the ongoing discourse on energy-efficient client selection strategies within the domain of Federated Learning.
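The abstract names the decision criteria (battery level, training time capacity, network quality) but does not specify how ESCS combines them. The sketch below is a minimal illustration only, assuming a weighted scoring rule with a per-round deadline; all names, weights, and thresholds here are hypothetical and are not taken from the paper.

```python
# Illustrative sketch only: the paper's actual ESCS rule is not given in the
# abstract. This assumes a weighted score over the three criteria named there.
from dataclasses import dataclass

@dataclass
class ClientState:
    client_id: str
    battery_level: float      # 0.0 (empty) .. 1.0 (full)
    est_training_time: float  # expected seconds for one local update
    network_quality: float    # 0.0 (poor) .. 1.0 (excellent)

def escs_score(c: ClientState, deadline: float,
               w_battery: float = 0.5, w_network: float = 0.5) -> float:
    """Hypothetical score: favor charged, well-connected clients that fit the round deadline."""
    if c.est_training_time > deadline:
        return 0.0  # cannot finish the local update in time
    return w_battery * c.battery_level + w_network * c.network_quality

def select_clients(clients, k: int, deadline: float):
    """Pick the k highest-scoring clients for this communication round."""
    ranked = sorted(clients, key=lambda c: escs_score(c, deadline), reverse=True)
    return [c for c in ranked[:k] if escs_score(c, deadline) > 0.0]

# Example round: choose 2 of 3 candidate devices.
candidates = [
    ClientState("phone-a", battery_level=0.9, est_training_time=12.0, network_quality=0.8),
    ClientState("phone-b", battery_level=0.2, est_training_time=30.0, network_quality=0.9),
    ClientState("tablet-c", battery_level=0.7, est_training_time=50.0, network_quality=0.6),
]
print([c.client_id for c in select_clients(candidates, k=2, deadline=40.0)])
```

In this toy example, tablet-c is excluded because its estimated training time exceeds the round deadline, and the remaining devices are ranked by battery and network quality; the actual ESCS mechanism may weigh or combine these criteria differently.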