Abstract

With the emergence of new vehicular applications, computation offloading based on mobile edge computing (MEC) has become a promising paradigm in resource-constrained vehicular networks. However, an unreasonable offloading strategy can cause serious energy consumption and latency. A real-time energy-aware offloading scheme for vehicular networks, based on MEC, is proposed to optimize communication and computation resources and thereby reduce energy consumption and latency. Because the joint computation offloading and resource allocation problem is a mixed-integer nonlinear programming (MINLP) problem, this article uses a bi-level optimization method to decompose the original MINLP into two subproblems. Furthermore, considering the mobility of vehicle users (V-UEs) and the availability of cloud resources, an offloading scheme based on deep reinforcement learning (DRL) is adopted to help users make optimal offloading decisions. The simulation results show that the proposed bi-level optimization algorithm reduces the total overhead by nearly 40% relative to the comparison algorithm.
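The bi-level decomposition can be pictured as an outer search over the binary offloading decisions and an inner resource-allocation step for a fixed set of decisions. The Python sketch below illustrates this structure only: the task model, the proportional-sharing inner step, and all parameter names are illustrative assumptions rather than the paper's exact subproblems, and the exhaustive outer search stands in for the DRL-based decision making used in the article.

```python
# Minimal sketch of a bi-level offloading/resource-allocation decomposition.
# All names, the task model, and the inner heuristic are illustrative
# assumptions, not the paper's exact formulation.
from itertools import product

def inner_resource_allocation(offload_mask, tasks, f_mec_total, bandwidth):
    """Lower level: for fixed offloading decisions, split MEC CPU cycles and
    bandwidth proportionally to task size (a simple stand-in for the paper's
    continuous resource-allocation subproblem) and return the weighted cost."""
    offloaded_bits = sum(t["bits"] for t, a in zip(tasks, offload_mask) if a == 1) or 1
    cost = 0.0
    for t, a in zip(tasks, offload_mask):
        if a == 0:                                   # local execution
            latency = t["cycles"] / t["f_local"]
            energy = t["kappa"] * t["f_local"] ** 2 * t["cycles"]
        else:                                        # offload to the MEC server
            share = t["bits"] / offloaded_bits
            rate = share * bandwidth * t["spectral_eff"]
            f_mec = share * f_mec_total
            latency = t["bits"] / rate + t["cycles"] / f_mec
            energy = t["p_tx"] * t["bits"] / rate
        cost += t["w_e"] * energy + t["w_t"] * latency   # weighted overhead
    return cost

def outer_offloading_search(tasks, f_mec_total, bandwidth):
    """Upper level: search the binary offloading decisions (exhaustive here;
    the paper handles this level with deep reinforcement learning)."""
    best_mask, best_cost = None, float("inf")
    for mask in product([0, 1], repeat=len(tasks)):
        c = inner_resource_allocation(mask, tasks, f_mec_total, bandwidth)
        if c < best_cost:
            best_mask, best_cost = mask, c
    return best_mask, best_cost
```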

Highlights

  • With the continuous progress of sixth-generation (6G) communication and the Internet of Vehicles [1,2,3,4,5,6], novel mobile applications in vehicular networks have increased the demand for low-delay, high-quality services, for example, interactive gaming, augmented reality/virtual reality (AR/VR), face recognition, and natural language processing [7]. The demand for computing is so prominent that it frequently exceeds the capacity that local mobile devices can provide [8]

  • To solve the offloading problem formulated as a Markov decision process (MDP), we propose an online learning scheme based on a model-free deep RL algorithm, the deep Q-network (DQN) [37]; a minimal sketch is given after this list

  • Vehicle users (V-UEs) are connected to Road Side Units (RSUs) via orthogonal frequency-division multiple access (OFDMA) technology, and they suffer from interference from neighboring RSUs
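As referenced above, the following is a minimal DQN sketch for the per-slot offloading decision. The state features (task size, channel gain, residual MEC capacity, vehicle speed), the discrete action set (e.g., local / RSU / cloud), the network size, and the use of PyTorch are assumptions for illustration; the paper's exact MDP design and hyperparameters may differ.

```python
# Minimal DQN sketch for V-UE offloading decisions (illustrative only).
import random
from collections import deque

import torch
import torch.nn as nn

class QNet(nn.Module):
    def __init__(self, state_dim, n_actions):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, n_actions),
        )

    def forward(self, s):
        return self.net(s)

class DQNAgent:
    """Picks an offload target from the V-UE state (assumed features:
    task size, channel gain, residual MEC capacity, vehicle speed)."""
    def __init__(self, state_dim, n_actions, gamma=0.95, eps=0.1, lr=1e-3):
        self.q = QNet(state_dim, n_actions)
        self.target = QNet(state_dim, n_actions)
        self.target.load_state_dict(self.q.state_dict())
        self.opt = torch.optim.Adam(self.q.parameters(), lr=lr)
        self.buffer = deque(maxlen=10_000)
        self.gamma, self.eps, self.n_actions = gamma, eps, n_actions

    def act(self, state):
        if random.random() < self.eps:            # epsilon-greedy exploration
            return random.randrange(self.n_actions)
        with torch.no_grad():
            return int(self.q(torch.as_tensor(state, dtype=torch.float32)).argmax())

    def step(self, transition, batch_size=64):
        """Store a (state, action, reward, next_state, done) transition and
        run one gradient step once enough experience has been collected."""
        self.buffer.append(transition)
        if len(self.buffer) < batch_size:
            return
        batch = random.sample(self.buffer, batch_size)
        s, a, r, s2, d = (torch.as_tensor(x, dtype=torch.float32) for x in zip(*batch))
        q_sa = self.q(s).gather(1, a.long().unsqueeze(1)).squeeze(1)
        with torch.no_grad():
            target = r + self.gamma * self.target(s2).max(1).values * (1 - d)
        loss = nn.functional.mse_loss(q_sa, target)
        self.opt.zero_grad()
        loss.backward()
        self.opt.step()

    def sync_target(self):
        # Call periodically (e.g., every few hundred steps) to refresh targets.
        self.target.load_state_dict(self.q.state_dict())
```

A reward signal consistent with the paper's objective would be the negative weighted sum of the energy consumption and latency incurred by the chosen action in each slot.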

Summary

Introduction

With the continuous progress of sixth-generation (6G) communication and the Internet of Vehicles [1,2,3,4,5,6], novel mobile applications in vehicular networks have increased the demand for low-delay, high-quality services, for example, interactive gaming, augmented reality/virtual reality (AR/VR), face recognition, and natural language processing [7]. The demand for computing is so prominent that it frequently exceeds the capacity that local mobile devices can provide [8]. It is therefore critical to make efficient offloading decisions and to study the trade-off between the energy consumption of vehicle units and the latency of the corresponding tasks. Based on the above discussion, in this article we propose a real-time energy-aware offloading scheme to study the trade-off between the energy consumption and the latency (transmission latency and execution latency) of the vehicle units, which optimizes the allocation of communication and computing resources. For V-UEs, the scheme combines computation offloading with communication and computation resource allocation to minimize the weighted sum of energy consumption and latency.
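A common way to write such a weighted-sum objective is shown below; the notation and the constraint shown are assumptions for illustration, not the paper's exact formulation.

```latex
\min_{\{a_i,\, p_i,\, f_i\}} \; \sum_{i=1}^{N} \Big( \lambda_i^{E}\, E_i(a_i, p_i, f_i) \;+\; \lambda_i^{T}\, T_i(a_i, p_i, f_i) \Big)
\quad \text{s.t.} \quad a_i \in \{0,1\}, \qquad \sum_{i:\, a_i = 1} f_i \le F_{\mathrm{MEC}},
```

where \(a_i\) is the offloading decision of V-UE \(i\), \(p_i\) its transmit power, \(f_i\) the MEC computing resource allocated to it, \(E_i\) and \(T_i\) its energy consumption and latency (transmission plus execution), \(\lambda_i^{E}, \lambda_i^{T}\) the trade-off weights, and \(F_{\mathrm{MEC}}\) the total MEC computing capacity.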

Related Works
System Model and Problem Formulations
Optimal Computation Offloading and Resource Allocation Scheme
Simulation Results
Conclusions