Abstract

Owing to their limited computing power and battery capacity, wireless users (WUs) can hardly handle compute-intensive workloads with their local processors. Multi-access edge computing (MEC) servers attached to base stations have ample computing and communication resources, which can be used to execute the computation tasks or workloads of WUs. In this study, we design a framework with multiple static and vehicle-assisted MEC servers to handle the workloads offloaded by WUs. To obtain the optimal computation offloading scheme that minimizes the weighted sum cost, comprising the transmission and execution cost, the energy consumption cost, and the communication bandwidth cost, we model the offloading decision optimization problem as a Markov decision process (MDP). We then propose a partial computation offloading scheme based on reinforcement learning (RL) to address the absence of prior knowledge. The proposed scheme learns the optimal offloading decision under stochastic workload arrivals, changing channel states, and the dynamic distance between WUs and the edge servers. Moreover, to avoid the curse of dimensionality caused by the large state and action spaces, we present an improved computation offloading method based on deep RL (DRL) that learns the optimal offloading policy using deep neural networks. Extensive numerical results illustrate that the proposed RL- and DRL-based algorithms can autonomously learn the optimal computation offloading policy without prior knowledge, and that their performance is better than that of four baseline algorithms.
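
For concreteness, the following is a minimal Python sketch of how such a tabular RL offloading scheme might look. The state discretization, the set of offload fractions, the cost weights, and all numeric parameters below are illustrative assumptions for this sketch, not values taken from the paper.

```python
import numpy as np

# Hypothetical discretized MDP for partial offloading: a state encodes the
# queued workload, the channel condition, and the WU-server distance; an
# action picks the fraction of the task offloaded to an MEC server.
N_STATES = 1000        # assumed size of the discretized state space
N_ACTIONS = 11         # offload fractions 0.0, 0.1, ..., 1.0 (illustrative)
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1   # placeholder learning parameters

Q = np.zeros((N_STATES, N_ACTIONS))

def weighted_cost(delay, energy, bandwidth, w=(0.4, 0.4, 0.2)):
    """Weighted sum of transmission/execution delay, energy consumption,
    and bandwidth cost, as described in the abstract; the weights w are
    placeholders, not values from the paper."""
    return w[0] * delay + w[1] * energy + w[2] * bandwidth

def choose_action(s, rng):
    """Epsilon-greedy policy over the candidate offload fractions."""
    if rng.random() < EPS:
        return int(rng.integers(N_ACTIONS))
    return int(Q[s].argmax())

def q_update(s, a, cost, s_next):
    """One tabular Q-learning step; the reward is the negative cost,
    so minimizing cost corresponds to maximizing return."""
    target = -cost + GAMMA * Q[s_next].max()
    Q[s, a] += ALPHA * (target - Q[s, a])

# Illustrative single step with placeholder observations:
rng = np.random.default_rng(0)
s = 0
a = choose_action(s, rng)
c = weighted_cost(delay=0.5, energy=0.3, bandwidth=0.2)
q_update(s, a, c, s_next=1)
```

The DRL variant described in the abstract would, in the same spirit, replace the Q-table with a deep neural network that maps the (continuous) workload, channel, and distance observations to action values, avoiding the table's growth with the size of the state and action spaces.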
