Abstract

Vehicular edge computing is an emerging enabler of improved driving efficiency and traffic safety. However, inhomogeneous load distribution among heterogeneous edge servers causes performance bottlenecks and low resource efficiency to arise concurrently, and limited infrastructure coverage further aggravates these issues. Although transferring large volumes of raw task data among heterogeneous edge servers can relieve the load imbalance, it markedly degrades the core network's efficiency, especially during rush hours, and it cannot remedy the coverage gaps. To relieve these concurrent issues without degrading the core network's efficiency, we introduce an aerial relay station that can flexibly relay vehicular tasks to nearby heterogeneous edge servers. The long-term task scheduling problem for such a vehicular edge system, without any prior knowledge of the environment, is crucial but remains open. We formulate the system latency minimization problem as a partially observable stochastic game, and then develop a model-free multi-agent reinforcement learning algorithm to search for a real-time load-aware scheduling policy. In addition, we design a practical factor, named the offloading latency gain, to assist the training process of the learning algorithm. Simulation experiments show that the proposed algorithm better exploits the idle computation resources of heterogeneous edge infrastructures and reduces the average system latency by up to 15-20% compared with existing algorithms.
