Abstract
Inspired by mobile edge computing (MEC), vehicular edge computing (VEC) enables vehicle terminals to support resource-hungry on-vehicle applications with significantly lower latency and lower energy consumption. In this paper, we investigate the computation offloading problem in a typical VEC scenario, where a vehicle offloads its computation tasks to the VEC servers deployed in the roadside unit (RSU) to minimize its long-term user cost. The mobility of the vehicle, coupled with the high dynamics of the environment, makes the problem particularly difficult. To tackle this challenge, a deep reinforcement learning (DRL) based offloading method is proposed, which approximates the offloading policy (OP) by a deep neural network (DNN) and trains the DNN with the proximal policy optimization (PPO) algorithm, without a priori knowledge of the environment dynamics. Extensive simulation experiments and a comprehensive comparison with six baseline algorithms demonstrate that the proposed method achieves the lowest user cost in most cases.
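The paper's specific network and policy architecture are not given in the abstract, but the PPO algorithm it names is built around a clipped surrogate objective that bounds how far each policy update can move. A minimal sketch of that objective, with illustrative values (the names and numbers here are not from the paper):

```python
import numpy as np

def ppo_clip_objective(ratio, advantage, eps=0.2):
    """PPO clipped surrogate objective for one (state, action) sample.

    ratio:     pi_new(a|s) / pi_old(a|s), the probability ratio between
               the updated policy and the policy that collected the data.
    advantage: estimated advantage A(s, a) of taking action a in state s.
    eps:       clipping range; 0.2 is the value commonly used in practice.
    """
    unclipped = ratio * advantage
    clipped = np.clip(ratio, 1.0 - eps, 1.0 + eps) * advantage
    # Taking the minimum removes any incentive to push the ratio
    # outside [1 - eps, 1 + eps], which keeps updates conservative.
    return np.minimum(unclipped, clipped)

# When the ratio drifts above 1 + eps with a positive advantage,
# the objective is capped, limiting the size of the policy step.
print(ppo_clip_objective(1.5, 1.0))  # capped at 1.2
print(ppo_clip_objective(0.9, 1.0))  # inside the clip range: 0.9
```

In an offloading setting such as the one described, the policy network would map the vehicle's observed state (e.g., task size, channel quality, server load) to an offloading decision, and this objective would be maximized over batches of collected trajectories.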