Abstract

Automotive Industry 5.0 will use Beyond Fifth-Generation (B5G) communications to provide robust, abundant computation resources and energy-efficient data sharing among various Intelligent Transportation System (ITS) entities. On top of the vehicular communication network, the Internet of Vehicles (IoV) emerges, in which vehicles’ resources, including processing, storage, sensing, and communication units, can be pooled to construct a Vehicular Cloudlet (VC) and realize resource sharing. As Connected and Autonomous Vehicle (CAV) onboard computing becomes more powerful, VC resources (comprising the idle resources of both stationary and moving vehicles) appear to be a promising way to meet vehicles’ ever-growing computing requirements. Moreover, such spare computing resources can significantly reduce the delay and transmission cost of task requests. To maximize the utility of task requests in the system under a maximum time constraint, this paper proposes a Secondary Resource Allocation (SRA) mechanism based on a dual time scale. The request service process is modeled as an M/M/1 queue, and each task request arriving in the same time slot is treated as an agent. A Partially Observable Markov Decision Process (POMDP) is formulated and solved with the Multi-Agent Reinforcement Learning (MARL) algorithm QMix, which exploits the overall vehicle state and queue state to make effective computing resource allocation decisions. Two main performance metrics are considered: the system’s total utility and the task completion rate. Simulation results show that the task completion rate is increased by 13%. Furthermore, compared with the deep deterministic policy optimization method, the proposed algorithm improves the overall utility value by 70% and the task completion rate by 6%.
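To make the queuing assumption concrete, the sketch below illustrates how an M/M/1 service model ties allocated computing capacity to a task's deadline: the mean sojourn time of a stable M/M/1 queue is 1/(μ − λ), and a task is counted as completed only if this delay fits within its maximum time constraint. The utility function and the numerical parameters here are illustrative placeholders, not the paper's actual formulation.

```python
def mm1_expected_sojourn(arrival_rate: float, service_rate: float) -> float:
    """Mean time a request spends in an M/M/1 queue (waiting + service).

    Valid only when the queue is stable (arrival_rate < service_rate);
    otherwise requests back up without bound.
    """
    if arrival_rate >= service_rate:
        return float("inf")
    return 1.0 / (service_rate - arrival_rate)


def task_utility(sojourn_time: float, max_delay: float, reward: float = 1.0) -> float:
    """Toy utility: full reward if the request finishes within its deadline,
    zero otherwise. The abstract does not specify the paper's utility
    function, so this shape is only an assumption for illustration.
    """
    return reward if sojourn_time <= max_delay else 0.0


# Hypothetical example: a cloudlet node receives 4 requests/s, the SRA
# decision allocates 6 requests/s of service capacity, and tasks tolerate
# at most 0.6 s of delay.
delay = mm1_expected_sojourn(arrival_rate=4.0, service_rate=6.0)  # 0.5 s
print(f"expected sojourn time: {delay:.2f} s")
print(f"task utility: {task_utility(delay, max_delay=0.6)}")
```

In the paper's setting, a QMix-style MARL policy would choose the per-request resource allocation (and hence the effective service rate) so that the sum of such utilities across all agents is maximized.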
