Abstract
Mobile edge computing (MEC) is applied in 5G communication to meet the demands of large-scale data communication. Unmanned aerial vehicles (UAVs) can serve as aerial base stations to provide edge computing services for users in remote areas. In this paper, we consider a multi-UAV-enabled MEC (Multi-UAV-MEC) network in which software-defined networking (SDN) is adopted to improve the quality of service (QoS) for all users, and we study the joint problem of task offloading and resource allocation. In the proposed network, UAVs act as MEC servers that provide computation offloading services for ground equipments (GEs), and an SDN controller is responsible for collecting global network information and providing the offloading decisions and resource allocation strategy for all GEs and UAVs. We aim to minimize the weighted sum of the task processing delay and energy consumption in the network, which is a mixed-integer, non-convex problem. To address this challenge, we transform the problem into a Markov decision process (MDP) and propose a deep reinforcement learning (DRL)-based algorithm, in which the SDN controller acts as the agent that learns the optimal offloading and resource allocation strategy. Simulation results show that the proposed DRL-based algorithm achieves better performance than other baseline algorithms under different conditions.
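For concreteness, the weighted-sum objective stated above can be sketched as follows; the notation here (N, T_n, E_n, \omega, \mathbf{a}, \mathbf{f}) is illustrative and not taken from the paper:

\[
\min_{\mathbf{a},\,\mathbf{f}} \;\sum_{n=1}^{N} \Big( \omega\, T_n(\mathbf{a},\mathbf{f}) + (1-\omega)\, E_n(\mathbf{a},\mathbf{f}) \Big),
\]

where \mathbf{a} collects the binary offloading decisions of the N GEs (the integer variables), \mathbf{f} collects the computation resources allocated by the UAVs (the continuous variables), T_n and E_n denote the processing delay and energy consumption of GE n, and \omega \in [0,1] trades off delay against energy. The coupling of the discrete decisions \mathbf{a} with the continuous allocation \mathbf{f} is what makes the problem mixed-integer and non-convex, motivating the MDP reformulation and the DRL-based solution.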