Computation offloading is a promising technique that enables resource-limited devices to run delay-constrained applications efficiently. Vehicular edge computing equips vehicles with processing capabilities and thus provides computing services to other vehicles through computation offloading. However, mobility changes the communication environment and poses critical challenges for offloading. In this paper, we consider an intelligent task offloading scenario for vehicular environments in which smart vehicles and roadside units cooperate to share resources. To minimize the average offloading cost, which accounts for energy consumption as well as delay in the transmission and processing phases, we formulate task offloading as an optimization problem and solve it with a deep reinforcement learning algorithm based on Double Q-learning. The algorithm allows user equipments to learn the offloading cost by observing the environment and to make stable sequences of offloading decisions despite environmental uncertainty. Moreover, to cope with the high mobility of the environment, we propose a handover-enabled computation offloading strategy that improves quality of service and quality of experience for users in beyond-5G and 6G heterogeneous networks. Simulation results demonstrate that the proposed scheme achieves a lower offloading cost than existing offloading decision strategies in the literature.
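The abstract states that the offloading cost combines energy consumption with transmission and processing delay. A plausible weighted-sum form of such an objective is sketched below; the symbols, weights, and the averaging over tasks are illustrative assumptions, not taken from the paper itself.

```latex
% Per-task offloading cost as a weighted sum of delay and energy
% (illustrative; \lambda_t, \lambda_e and the symbols are assumptions):
\begin{equation}
  C_i = \lambda_t \left( T_i^{\mathrm{tx}} + T_i^{\mathrm{proc}} \right)
      + \lambda_e \left( E_i^{\mathrm{tx}} + E_i^{\mathrm{proc}} \right),
  \qquad \lambda_t + \lambda_e = 1 .
\end{equation}
% The optimization then minimizes the average cost over N tasks:
\begin{equation}
  \min_{\text{offloading decisions}} \; \frac{1}{N} \sum_{i=1}^{N} C_i .
\end{equation}
```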
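The decision algorithm is based on Double Q-learning, whose key idea is to decouple action selection from action evaluation to reduce overestimation bias. The sketch below shows this update rule in a tabular toy setting with a made-up offloading environment; the paper uses a deep network rather than tables, and the state/action sizes, reward model, and hyperparameters here are assumptions for illustration only.

```python
import numpy as np

# Hypothetical discretization: a state bins channel quality and server load;
# an action picks an offloading target (0 = local, 1..K-1 = edge nodes).
N_STATES, N_ACTIONS = 16, 4
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.1  # assumed hyperparameters

rng = np.random.default_rng(0)
q_a = np.zeros((N_STATES, N_ACTIONS))  # first value estimator
q_b = np.zeros((N_STATES, N_ACTIONS))  # second (decoupled) estimator

def step(state, action):
    """Toy environment: reward is the negative offloading cost
    (weighted delay + energy), drawn from an assumed distribution."""
    cost = rng.uniform(0.1, 1.0) * (1 + 0.1 * action)
    return int(rng.integers(N_STATES)), -cost

state = 0
for _ in range(5000):
    # Epsilon-greedy policy over the sum of both estimators.
    q_sum = q_a[state] + q_b[state]
    action = int(rng.integers(N_ACTIONS)) if rng.random() < EPS else int(q_sum.argmax())
    next_state, reward = step(state, action)
    # Double Q-learning: one estimator selects the greedy next action,
    # the other evaluates it, which curbs overestimation bias.
    if rng.random() < 0.5:
        a_star = int(q_a[next_state].argmax())
        q_a[state, action] += ALPHA * (reward + GAMMA * q_b[next_state, a_star]
                                       - q_a[state, action])
    else:
        b_star = int(q_b[next_state].argmax())
        q_b[state, action] += ALPHA * (reward + GAMMA * q_a[next_state, b_star]
                                       - q_b[state, action])
    state = next_state
```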