Abstract

Vehicular Edge Computing (VEC) has gained popularity due to its ability to enhance vehicular networks. VEC servers located at Roadside Units (RSUs) allow low-power vehicles to offload computation-intensive and delay-sensitive applications, making VEC a promising solution. However, optimal resource allocation among edge servers is a complex problem because of vehicle mobility and dynamic data traffic. To address this issue, we propose a Lyapunov-based Multi-Agent Deep Deterministic Policy Gradient (L-MADDPG) method that jointly optimizes computing task distribution and radio resource allocation to minimize energy consumption while meeting delay requirements. We evaluate the trade-offs among optimization performance, queuing behavior, and energy consumption. We first derive delay, queue, and energy models for task execution at the vehicle or the RSU, and then present the L-MADDPG algorithm, which jointly optimizes task offloading and resource allocation to reduce energy consumption without compromising performance. Simulation results show that the proposed algorithm reduces energy consumption while maintaining system performance compared with existing algorithms.
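Although the abstract does not give the exact formulation, Lyapunov-based designs of this kind are typically built on a drift-plus-penalty objective that is minimized in each time slot. The sketch below illustrates that standard form; the notation Q_i(t) for the task-queue backlog of vehicle or RSU agent i, E(t) for the system energy consumption in slot t, and the weight V trading off energy against queue stability (and hence delay) is assumed here, not taken from the paper.

% Assumed notation: Q_i(t) is the task-queue backlog of agent i at slot t,
% E(t) is the system energy consumption in slot t, and V >= 0 weights the
% energy penalty against queue stability (delay).
\begin{align}
  L(t) &= \tfrac{1}{2}\sum_{i} Q_i(t)^2
    && \text{(quadratic Lyapunov function)} \\
  \Delta(t) &= \mathbb{E}\!\left[ L(t+1) - L(t) \,\middle|\, Q(t) \right]
    && \text{(conditional Lyapunov drift)} \\
  \min_{\text{offloading, resources}} \;&\; \Delta(t) + V\,\mathbb{E}\!\left[ E(t) \,\middle|\, Q(t) \right]
    && \text{(drift-plus-penalty objective)}
\end{align}

In a multi-agent setting such as L-MADDPG, each agent would aim to minimize a per-slot objective of this form through its offloading and resource-allocation actions, with larger V favoring energy savings and smaller V favoring short queues.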
