Abstract

Mobile edge computing (MEC) can transfer the computing tasks of mobile applications to nearby edge devices, relieving the processing load on local servers and avoiding delays in the backhaul and core networks, thereby addressing the resource-allocation limitations of cloud computing. However, the complexity of the wireless environment means that when computing tasks are uploaded to the MEC side over wireless links, they are prone to traffic congestion and packet loss, so low total system energy consumption and short total delay cannot be guaranteed. To address the problem that mobile devices cannot process many computation-intensive tasks in a timely manner, this paper proposes a task offloading optimization scheme for SDN-enabled MEC environments. The computation offloading problem is modeled based on Lyapunov optimization, and the offloading delay is then analyzed with respect to the offloading gain. To guarantee application requirements, minimize energy consumption and latency, and better satisfy user QoS requests, this paper further proposes a resource allocation strategy based on deep reinforcement learning. The strategy designs a DQN-based resource allocation algorithm that deploys a joint optimal offloading decision and resource allocation scheme in a mobile edge computing environment under limited computational resources and the latency constraints of the computational tasks. Experimental results show that the proposed task offloading strategy reduces overall latency, while the proposed resource allocation strategy reduces the total energy consumption and total delay of the system and improves the successful execution rate of tasks.
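The abstract names a DQN-based joint offloading and resource allocation algorithm but gives no implementation details. The sketch below illustrates how such an agent is typically structured, assuming a discretized joint action space (offloading decision combined with a resource share) and a state built from quantities such as task size, queue backlog, and channel quality; all names and dimensions (OffloadDQNAgent, state_dim, n_actions, and so on) are illustrative assumptions, not the paper's actual design.

```python
# Minimal sketch of a standard DQN agent for joint offloading/resource
# allocation decisions. The state/action encoding and reward are assumptions;
# the reward could, for example, be the negative weighted sum of energy
# consumption and delay, matching the objective stated in the abstract.
import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim


class QNetwork(nn.Module):
    """Maps a system state (task size, queue backlog, channel gain, ...)
    to Q-values over joint (offload target, resource share) actions."""

    def __init__(self, state_dim: int, n_actions: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, n_actions),
        )

    def forward(self, x):
        return self.net(x)


class OffloadDQNAgent:
    def __init__(self, state_dim=4, n_actions=8, gamma=0.99, lr=1e-3):
        self.q = QNetwork(state_dim, n_actions)
        self.target_q = QNetwork(state_dim, n_actions)
        self.target_q.load_state_dict(self.q.state_dict())
        self.opt = optim.Adam(self.q.parameters(), lr=lr)
        self.gamma = gamma
        self.n_actions = n_actions
        self.replay = deque(maxlen=10_000)

    def act(self, state, epsilon=0.1):
        # Epsilon-greedy choice over the joint offloading/resource actions.
        if random.random() < epsilon:
            return random.randrange(self.n_actions)
        with torch.no_grad():
            q_values = self.q(torch.as_tensor(state, dtype=torch.float32))
        return int(q_values.argmax().item())

    def store(self, s, a, r, s_next, done):
        self.replay.append((s, a, r, s_next, done))

    def train_step(self, batch_size=64):
        if len(self.replay) < batch_size:
            return
        batch = random.sample(self.replay, batch_size)
        s, a, r, s2, d = (torch.as_tensor(x, dtype=torch.float32)
                          for x in zip(*batch))
        a = a.long()
        # Q(s, a) for the actions actually taken.
        q_sa = self.q(s).gather(1, a.unsqueeze(1)).squeeze(1)
        with torch.no_grad():
            # Bootstrapped target from the (periodically synced) target network.
            target = r + self.gamma * self.target_q(s2).max(1).values * (1 - d)
        loss = nn.functional.mse_loss(q_sa, target)
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()
```

In this sketch the environment step (not shown) would return the next system state and a reward reflecting the energy and delay incurred by the chosen offloading decision; how the paper couples this with its Lyapunov-based offloading model is not specified in the abstract.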
