Abstract

In Federated Learning (FL), a global statistical model is built by having mobile users train the model on their local data and by aggregating the resulting local model parameters in an iterative manner. However, due to the limited energy and computation capability of mobile devices, the performance of model training is often compromised in order to meet the objective of local energy minimization. Multi-access Edge Computing (MEC)-enabled FL addresses this tradeoff between model performance and the energy consumption of the mobile devices by allowing users to offload a portion of their local datasets to an edge server for model training. Since the edge server has high computation capability, the time spent on model training at the edge server is insignificant; however, the time required to offload datasets from the mobile users to the edge server has a significant impact on the total time needed to complete a single round of the FL process. Resource management in MEC-enabled FL is therefore challenging: the objective is to reduce the total time consumption while limiting the energy consumption of the mobile devices. In this article, we formulate an energy-aware resource management problem for MEC-enabled FL in which the model training loss and the total time consumption are jointly minimized, subject to the energy limitations of the mobile devices. We recast the formulated problem as a Generalized Nash Equilibrium Problem (GNEP) to capture the coupling constraints between radio resource management and dataset offloading, and we analyze the impact of dataset offloading and computing resource allocation on the model training loss, time, and energy consumption. Finally, we present a convergence analysis of the proposed solution and evaluate its performance against the traditional FL approach. Simulation results demonstrate the efficacy of the proposed approach.
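For context, the training loss referred to above corresponds, in the standard FL formulation, to a sample-weighted average of the users' local empirical losses (generic notation, not necessarily the paper's own):

    F(w) = \sum_{k=1}^{K} \frac{n_k}{n} F_k(w), \qquad F_k(w) = \frac{1}{n_k} \sum_{i \in \mathcal{D}_k} \ell(w; x_i, y_i),

where \mathcal{D}_k is user k's local dataset, n_k = |\mathcal{D}_k|, and n = \sum_k n_k is the total number of training samples across the K users.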

Highlights

  • Federated Learning (FL) builds a statistical model by allowing mobile users to train local models on datasets residing at their mobile devices [1]

  • By allowing the mobile users to offload a portion of their local datasets to the edge server, the performance of the global model can be preserved while saving the energy consumption of the mobile devices

  • Since the users need to upload their local datasets to the edge server, the total time consumption of the proposed scheme is higher than that of traditional FL, but it can be reduced through suitable resource management (a sketch of this tradeoff follows this list)
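A minimal sketch of the timing tradeoff in the last highlight, assuming a fixed uplink rate, negligible edge training time (as stated in the abstract), and that uploading the offloaded portion runs in parallel with local training on the retained portion; the function name and parameter values are illustrative, not taken from the paper:

    def round_time(data_bits, offload_frac, uplink_rate_bps,
                   cycles_per_bit, cpu_freq_hz):
        """Approximate per-user time for one FL round with partial dataset offloading.

        Assumptions (ours, for illustration): edge training time is negligible,
        and uploading the offloaded shard proceeds in parallel with local training
        on the retained shard, so completion time is the slower of the two.
        """
        upload_time = offload_frac * data_bits / uplink_rate_bps
        local_time = (1.0 - offload_frac) * data_bits * cycles_per_bit / cpu_freq_hz
        return max(upload_time, local_time)

    # Example: 10 MB dataset, 20% offloaded over a 5 Mbit/s uplink,
    # 100 CPU cycles per bit of training data, 1 GHz local CPU.
    print(round_time(8e7, 0.2, 5e6, 100, 1e9))

Offloading more data shortens local training but lengthens the upload; a resource management scheme would pick the offloading fraction (and uplink/CPU allocation) that balances the two.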

Summary

INTRODUCTION

Federated Learning (FL) builds a statistical model by allowing mobile users to train local models on datasets residing at their mobile devices [1]. By allowing the mobile users to offload a portion of their local datasets to the edge server, the performance of the global model can be preserved while reducing the energy consumption of the mobile devices. In line with this idea, the works in [17] and [16] proposed local data-sharing mechanisms for FL. The learning model, dataset offloading, local computing, and uplink radio resource management are jointly optimized to minimize the training loss and the time consumption of one global round while ensuring that the energy constraints of the mobile devices are met. Extensive simulations compare the proposed MEC-enabled FL with traditional FL in terms of learning performance, time, and energy consumption under the dataset offloading and local computing resource management. The model aggregation is carried out after the edge training and the weight transmission of all mobile devices.
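A hedged sketch of that aggregation step, assuming the server simply takes a sample-weighted average over each user's locally trained weights and the weights trained at the edge on the offloaded shards; the function and weighting rule are assumptions for illustration, not the paper's exact update:

    import numpy as np

    def aggregate(local_weights, local_sizes, edge_weights, edge_sizes):
        """Sample-weighted average of locally trained and edge-trained parameters.

        local_weights / edge_weights: lists of parameter vectors (numpy arrays),
        one per retained shard and one per offloaded shard, respectively.
        local_sizes / edge_sizes: the corresponding numbers of training samples.
        (Assumed FedAvg-style rule; the paper's aggregation may differ.)
        """
        weights = list(local_weights) + list(edge_weights)
        sizes = list(local_sizes) + list(edge_sizes)
        total = float(sum(sizes))
        global_w = np.zeros_like(weights[0], dtype=float)
        for w, n in zip(weights, sizes):
            global_w += (n / total) * np.asarray(w, dtype=float)
        return global_w

    # Toy usage: two users, each with a retained and an offloaded shard.
    w_global = aggregate(
        local_weights=[np.array([0.1, 0.2]), np.array([0.3, 0.1])],
        local_sizes=[800, 600],
        edge_weights=[np.array([0.2, 0.2]), np.array([0.4, 0.0])],
        edge_sizes=[200, 400],
    )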

COMMUNICATION MODEL
FEDERATED LEARNING MODEL
LOCAL TRAINING MODEL
EDGE TRAINING MODEL
DATASET OFFLOADING PROBLEM
UPLINK RESOURCE MANAGEMENT PROBLEM
GNEP FORMULATION FOR TIME MINIMIZATION
SIMULATION RESULTS
CONCLUSION
ENERGY CONSUMPTION OF USERS