Abstract

Mobile application services increasingly demand high data rates, low latency, and high reliability. Combining digital twin (DT) technology with a mobile edge computing (MEC) network can effectively meet these demands, since the DT helps the MEC network monitor and predict network states. In this paper, we propose a DT-aided MEC network scenario in which deep neural network (DNN) inference is the computing task of the end devices (EDs); each ED can offload part of its DNN layers to the MEC server. To allocate communication resources, we propose an algorithm based on asynchronous advantage actor-critic (A3C) that manages the transmission power and channel selection of the EDs. Because DNN inference requests arrive continuously in real scenarios, we consider continuous DNN inference tasks. We convert the problem of finding the optimal DNN partition point into a min st-cut problem and propose a graph-theory-based algorithm that solves it to minimize the inference latency. Simulation results show that the proposed algorithm effectively reduces the inference latency. Compared with actor-critic (AC) and deep Q-network (DQN), the proposed algorithm converges faster and to a better value. Compared with a traditional one-time DNN model partition algorithm, the proposed algorithm is better suited to scenarios with continuously arriving DNN tasks.
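To make the min st-cut formulation concrete, the following sketch builds a flow graph for a small chain DNN and reads the partition off the cut. It is only an illustration of the idea stated above, not the paper's algorithm: the layer latency profile, uplink rate, and layer names are hypothetical, the construction ignores downlink cost for returning results, and it uses the networkx library. Cutting the edge from a layer to the sink charges its device execution time, cutting the edge from the source charges its server execution time, and cutting a layer-to-layer edge charges the transmission time of the intermediate output.

    # Illustrative sketch only: DNN partition point as a min s-t cut (hypothetical numbers).
    import networkx as nx

    # Hypothetical per-layer profile: (device_ms, server_ms, output_kb)
    layers = {
        "conv1": (12.0, 1.5, 200.0),
        "conv2": (20.0, 2.0, 100.0),
        "fc1":   (8.0,  0.8, 16.0),
        "fc2":   (3.0,  0.3, 4.0),
    }
    chain = ["conv1", "conv2", "fc1", "fc2"]
    input_kb = 1200.0            # assumed raw input size
    uplink_ms_per_kb = 0.05      # assumed uplink transmission latency per kB

    G = nx.DiGraph()
    for name, (t_dev, t_srv, out_kb) in layers.items():
        # Cutting s->layer assigns the layer to the MEC server (pay server latency).
        G.add_edge("s", name, capacity=t_srv)
        # Cutting layer->t keeps the layer on the end device (pay device latency).
        G.add_edge(name, "t", capacity=t_dev)
    # The raw input is pinned to the device side; offloading the first layer
    # therefore pays the cost of transmitting the input over the uplink.
    G.add_edge("s", "input", capacity=float("inf"))
    G.add_edge("input", chain[0], capacity=input_kb * uplink_ms_per_kb)
    for u, v in zip(chain, chain[1:]):
        # A cut layer-to-layer edge means the intermediate output crosses the
        # wireless link, so its capacity is that output's transmission latency.
        G.add_edge(u, v, capacity=layers[u][2] * uplink_ms_per_kb)

    latency, (device_side, server_side) = nx.minimum_cut(G, "s", "t")
    on_device = [l for l in chain if l in device_side]
    on_server = [l for l in chain if l in server_side]
    print(f"estimated inference latency: {latency:.1f} ms")
    print("run on ED:", on_device, "| offload to MEC:", on_server)

With these illustrative numbers the cut should land after conv1, i.e., the first convolution runs on the ED and the remaining layers are offloaded; the cut value sums device compute, uplink transmission of the crossing output, and server compute, which is the sequential inference latency the partition algorithm minimizes.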
