Abstract

Various interdependent and computationally intensive on-vehicle tasks place great pressure on the limited computing power of vehicles. Vehicular edge computing (VEC) is considered a promising paradigm for addressing this problem. However, owing to their high mobility, vehicles pass through the coverage of multiple road-side units (RSUs) while their tasks are being computed, so coordinating the offloading decisions of these RSUs is a challenge. In this study, we propose a dependent task offloading scheme that accounts for vehicle mobility, service availability, and task priority. To coordinate the offloading decisions among the RSUs, a Markov decision process (MDP) is carefully designed in which the action of each RSU is divided into three steps that separately decide whether, where, and how each task is offloaded. An advanced deep reinforcement learning (DRL) algorithm based on the deep deterministic policy gradient (DDPG) is then adopted to solve this problem. Simulation results show that the proposed scheme achieves better performance in reducing task processing latency and energy consumption.
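The abstract does not give implementation details of the three-step action. The following is a minimal Python sketch, assuming a continuous DDPG actor output that is decoded into a structured per-task decision; names such as `TaskAction`, `decode_action`, and `num_rsus` are illustrative assumptions, not the authors' code.

```python
# A minimal sketch (not the authors' implementation) of the three-step
# action described in the abstract: for each task arriving at an RSU,
# the agent decides (1) whether to offload, (2) where to offload, and
# (3) how much compute to allocate to it.
from dataclasses import dataclass

import numpy as np


@dataclass
class TaskAction:
    offload: bool        # step 1: whether the task is offloaded at all
    target_rsu: int      # step 2: where -- index of the serving RSU
    cpu_fraction: float  # step 3: how -- share of the RSU's CPU budget


def decode_action(raw: np.ndarray, num_rsus: int) -> TaskAction:
    """Map a continuous DDPG actor output in [-1, 1]^3 to a structured
    three-step offloading decision for one task."""
    offload = raw[0] > 0.0
    # Scale the second component to a discrete RSU index.
    target_rsu = int(np.clip((raw[1] + 1.0) / 2.0 * num_rsus, 0, num_rsus - 1))
    # Scale the third component to a CPU allocation fraction in (0, 1].
    cpu_fraction = float(np.clip((raw[2] + 1.0) / 2.0, 1e-3, 1.0))
    return TaskAction(offload, target_rsu, cpu_fraction)


if __name__ == "__main__":
    # Example: decode a raw actor output for a road segment with 4 RSUs.
    print(decode_action(np.array([0.3, -0.5, 0.8]), num_rsus=4))
```

Decomposing the action this way keeps the actor output continuous, which is what DDPG requires, while still yielding the discrete whether/where choices and the continuous resource-allocation choice described in the abstract.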
