Abstract

Vehicular edge computing (VEC) greatly enhances the quality of vehicle services with low latency and high reliability. However, in areas not covered by roadside infrastructure, or when the infrastructure is damaged or fails, offloaded tasks cannot be executed. Even in areas where infrastructure is deployed, when a large number of offloaded tasks are generated, the edge servers may be unable to process them in time owing to their limited computing resources. Based on these observations, we propose parked vehicle cooperation in VEC, in which roadside parked vehicles with underutilized computational resources cooperate to perform compute-intensive tasks. Our approach aims to overcome the challenges posed by missing or failed infrastructure and to compensate for the shortage of computing resources in VEC. In our approach, the roadside parked vehicles are first organized into parking clusters. Then, the optimal amount of resources required for each offloaded task is analyzed. Furthermore, a task offloading algorithm based on deep reinforcement learning (DRL) is proposed to minimize the total cost, which comprises the task execution delay and the energy consumption overhead incurred by the parked vehicles in executing the task. Extensive simulation results show that, compared with other algorithms, our approach achieves both the highest task completion rate and the lowest task execution cost.
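The cost structure described above (a combination of execution delay and parked-vehicle energy overhead) can be illustrated with a minimal sketch. All names, units, and the weighting factor below are illustrative assumptions, not the paper's notation, and the greedy selection stands in for the DRL policy the paper actually learns:

```python
# Hypothetical sketch of a delay-plus-energy task cost for a parked vehicle.
# The weight alpha and all vehicle parameters are assumed, not from the paper.

def task_cost(task_cycles, cpu_freq_hz, power_watts, alpha=0.5):
    """Weighted sum of execution delay (s) and energy consumption (J)."""
    delay = task_cycles / cpu_freq_hz   # time to execute the task
    energy = power_watts * delay        # energy the parked vehicle spends
    return alpha * delay + (1 - alpha) * energy

def pick_vehicle(task_cycles, vehicles, alpha=0.5):
    """Greedy baseline: offload to the parked vehicle with the lowest cost.
    The paper replaces this rule with a DRL-learned offloading decision."""
    return min(vehicles,
               key=lambda v: task_cost(task_cycles, v["freq"], v["power"], alpha))

# Two hypothetical parked vehicles in a cluster.
vehicles = [
    {"id": "pv1", "freq": 2.0e9, "power": 10.0},  # fast but power-hungry
    {"id": "pv2", "freq": 1.5e9, "power": 6.0},   # slower but more frugal
]
best = pick_vehicle(5e9, vehicles)  # 5 Gcycles task
```

With equal weighting, the slower but more energy-efficient vehicle can win, which is exactly the delay/energy trade-off the total-cost objective captures.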
