Abstract

In Federated Learning (FL), hyper-parameters significantly affect the training overhead in terms of computation time, transmission time, computation load, and transmission load. The current practice of manually selecting FL hyper-parameters places a heavy burden on FL practitioners, since different applications have different training preferences. In this paper, we propose FedTune, an automatic hyper-parameter tuning algorithm tailored to applications' diverse system requirements in FL training. FedTune is lightweight and flexible, achieving 8.48%-26.75% improvement across different datasets compared to using fixed FL hyper-parameters.
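The abstract frames FL tuning as trading off four overhead components under application-specific preferences. As a minimal sketch of that idea (not the paper's actual objective; the weight names `alpha`, `beta`, `gamma`, `delta` and the linear form are assumptions for illustration), one could score a hyper-parameter setting by a preference-weighted sum of its overheads and pick the setting with the lowest score:

```python
def overhead_score(comp_time, trans_time, comp_load, trans_load,
                   alpha, beta, gamma, delta):
    """Combine the four overhead components into one score using
    application preference weights that sum to 1.

    Hypothetical formulation for illustration; FedTune's actual
    tuning objective may differ.
    """
    assert abs(alpha + beta + gamma + delta - 1.0) < 1e-9
    return (alpha * comp_time + beta * trans_time
            + gamma * comp_load + delta * trans_load)


def best_setting(settings, alpha, beta, gamma, delta):
    """Pick the hyper-parameter setting with the lowest weighted
    overhead. Each setting maps a name to its four measured
    overheads (computation time, transmission time,
    computation load, transmission load)."""
    return min(
        settings,
        key=lambda name: overhead_score(*settings[name],
                                        alpha, beta, gamma, delta),
    )
```

For example, a latency-sensitive application might put most weight on computation and transmission time, so a setting that finishes faster wins even if it transmits more data.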
