Abstract

Vehicle edge computing (VEC) provides efficient services for vehicles by offloading tasks to edge servers. However, existing research mainly employs methods such as deep learning and reinforcement learning to make resource-allocation decisions, without adequately accounting for the effects of high-speed vehicle mobility and the dynamic nature of the Internet of Vehicles (IoV) on the decision-making process. This paper tackles this issue by introducing a novel concept: a digital twin-assisted IoV. Specifically, the digital twin of the IoV supplies training data for computation-offloading and content-caching decisions, allowing edge servers to interact directly with the dynamic environment while capturing its changes in real time. Through this collaboration, intelligent edge servers can promptly respond to vehicular requests and return results. We formulate the dynamic edge computing problem as a Markov decision process (MDP) and solve it with the twin delayed deep deterministic policy gradient (TD3) algorithm. Simulation experiments demonstrate the adaptability of the proposed approach in dynamic environments and show that it enhances Quality of Service, i.e., it reduces total delay and energy consumption.
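The abstract's TD3 solver rests on three standard mechanisms: twin critics with a clipped double-Q target, target-policy smoothing, and delayed actor/target updates. The following is a minimal, hedged sketch of those mechanisms on a hypothetical 1-D toy problem with linear function approximators; it is an illustration of the TD3 update rules, not the paper's actual implementation or its VEC environment.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setting: scalar state/action, linear critics Q(s,a) = w.[s,a,1]
# and a linear actor a = theta * s. Hyperparameters below are illustrative.
GAMMA, TAU, POLICY_DELAY, NOISE_STD, NOISE_CLIP, LR = 0.99, 0.005, 2, 0.2, 0.5, 0.01

w1, w2 = rng.normal(size=3), rng.normal(size=3)   # twin critic weights
w1_t, w2_t = w1.copy(), w2.copy()                 # target critic weights
theta, theta_t = 0.1, 0.1                         # actor and target-actor weights

def q(w, s, a):
    """Linear critic value for state-action pair (s, a)."""
    return w @ np.array([s, a, 1.0])

def td3_step(step, s, a, r, s2):
    global w1, w2, w1_t, w2_t, theta, theta_t
    # (1) Target-policy smoothing: perturb the target action with clipped noise.
    noise = np.clip(rng.normal(0.0, NOISE_STD), -NOISE_CLIP, NOISE_CLIP)
    a2 = theta_t * s2 + noise
    # (2) Clipped double-Q: bootstrap from the minimum of the twin target critics.
    y = r + GAMMA * min(q(w1_t, s2, a2), q(w2_t, s2, a2))
    feat = np.array([s, a, 1.0])
    for w in (w1, w2):                            # move each critic toward target y
        w += LR * (y - w @ feat) * feat
    # (3) Delayed updates: actor and targets move only every POLICY_DELAY steps.
    if step % POLICY_DELAY == 0:
        theta += LR * w1[1] * s                   # dQ/da = w1[1], da/dtheta = s
        w1_t = TAU * w1 + (1 - TAU) * w1_t        # Polyak averaging of targets
        w2_t = TAU * w2 + (1 - TAU) * w2_t
        theta_t = TAU * theta + (1 - TAU) * theta_t

for t in range(1, 101):                           # short illustrative training loop
    s = rng.uniform(-1, 1)
    a = theta * s + rng.normal(0.0, 0.1)          # exploration noise on the action
    r = -(a - s) ** 2                             # toy reward, not the paper's QoS cost
    td3_step(t, s, a, r, s + 0.1)
```

In the paper's setting, the state would encode vehicle mobility and cache/offloading context supplied by the digital twin, and the reward would combine delay and energy consumption; the update mechanics above remain the same.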
