Abstract

In the Industrial Internet, Mobile Edge Computing (MEC) can offload a large number of delay-sensitive and compute-intensive tasks to MEC servers, thereby improving Quality of Service (QoS). Because Unmanned Aerial Vehicles (UAVs) offer wide communication coverage and low deployment cost, they have great potential to serve as aerial base stations that provide computation resources for Intelligent Mobile Devices (IMDs). Given the limited computation resources and energy of IMDs, we design a multi-UAV-assisted MEC system for ground cells. To minimize the weighted sum of task completion delay and energy consumption while guaranteeing the QoS requirements of IMDs, we jointly consider the dynamic channel state, renewable energy utilization, UAV trajectories, and the task offloading ratio. To handle the non-convexity of the problem and its complex high-dimensional state space, we propose a model-free Deep Reinforcement Learning (DRL) offloading scheme based on the Twin Delayed Deep Deterministic Policy Gradient (TD3) algorithm. Moreover, we adopt Federated Learning (FL) to train the DRL models, enhancing model robustness and the security of IMD data. Meanwhile, the real environment is modeled as a Digital Twin (DT) to monitor network changes and train the local DRL models, and the central cloud server obtains the local models in real time to aggregate a global model. Extensive numerical results show that the proposed algorithm improves system energy efficiency and reduces task completion delay.
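To make the objective concrete, the sketch below illustrates a generic weighted delay-plus-energy cost for partially offloading one task, as commonly formulated in UAV-MEC work. It is not the paper's model: all parameter values (task size, CPU frequencies, uplink rate, transmit power, the effective capacitance coefficient `kappa`, and the weights) are hypothetical, and the grid search over the offloading ratio simply stands in for the decision that the TD3 agent would learn.

```python
# Illustrative sketch only (not the paper's system model): weighted sum of
# completion delay and IMD energy for a given offloading ratio.
# All numeric parameters below are hypothetical placeholders.

def offload_cost(ratio, task_bits=1e6, cycles_per_bit=1000,
                 f_local=1e9, f_uav=5e9, rate=2e6,
                 kappa=1e-27, p_tx=0.5, w_delay=0.5, w_energy=0.5):
    """Cost for splitting a task: ratio=0 runs all bits locally on the IMD,
    ratio=1 offloads all bits to the UAV-mounted MEC server."""
    local_bits = (1 - ratio) * task_bits
    off_bits = ratio * task_bits
    # Local computation: delay, plus dynamic CPU energy (kappa * f^2 per cycle)
    t_local = local_bits * cycles_per_bit / f_local
    e_local = kappa * (f_local ** 2) * local_bits * cycles_per_bit
    # Offloaded part: uplink transmission delay/energy, then UAV compute delay
    t_tx = off_bits / rate
    t_uav = off_bits * cycles_per_bit / f_uav
    e_tx = p_tx * t_tx
    # Local and offloaded portions execute in parallel, so delay is the max;
    # only IMD-side energy (local compute + transmission) is counted.
    delay = max(t_local, t_tx + t_uav)
    energy = e_local + e_tx
    return w_delay * delay + w_energy * energy

# Exhaustive grid search over the ratio; a DRL agent would instead learn
# this choice jointly with the UAV trajectory from the observed state.
best_cost, best_ratio = min((offload_cost(r / 100), r / 100)
                            for r in range(101))
```

With these placeholder parameters, full offloading happens to be cheapest; under other channel rates or CPU frequencies the optimum shifts, which is the trade-off the learned policy must track as the channel state varies.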

