Abstract

The advancement of 5G technology has spurred the rapid development of the Internet of Vehicles (IoV). IoV services are not only computation-intensive but also extremely delay-sensitive. As a promising computing paradigm, mobile edge computing (MEC) can be applied to IoV scenarios. However, owing to the limited resources of a single MEC server, it is difficult to cope with sudden increases in computation load caused by emergencies, or with intensive resource requests from busy regions. We therefore propose a novel regional intelligent management vehicular system with dual MEC planes, in which MEC servers in the same region cooperate to share resources. We classify computing tasks into different types according to their delay tolerances and focus on the problem of allocating resources to tasks of different types. We then design a resource allocation algorithm based on deep reinforcement learning, which can adapt to the changing MEC environment and process high-dimensional data. Simulation results confirm that the proposed scheme is feasible and effective.
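As a rough illustration of the task typing described above, the following sketch classifies incoming tasks by their delay tolerance before any resources are allocated. The thresholds, field names, and type labels are illustrative assumptions; the paper does not specify concrete values.

```python
from dataclasses import dataclass

# Hypothetical delay-tolerance thresholds in seconds (not taken from the paper).
DELAY_SENSITIVE_MAX = 0.05
DELAY_TOLERANT_MIN = 0.5

@dataclass
class Task:
    task_id: int
    cpu_cycles: float       # required computation (CPU cycles)
    data_size: float        # input data size (bits)
    delay_tolerance: float  # maximum tolerable delay (seconds)

def classify(task: Task) -> str:
    """Assign a task type from its delay tolerance."""
    if task.delay_tolerance <= DELAY_SENSITIVE_MAX:
        return "delay-sensitive"
    if task.delay_tolerance >= DELAY_TOLERANT_MIN:
        return "delay-tolerant"
    return "moderate"

# Example: a safety-critical task versus an infotainment-style task.
tasks = [Task(1, 1e9, 2e6, 0.02), Task(2, 5e8, 1e6, 1.0)]
for t in tasks:
    print(t.task_id, classify(t))
```

The allocator can then prioritize the delay-sensitive class when deciding which MEC server in the region serves each task.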

Highlights

  • With the progress of 5G technology, the world has accelerated its pace into the 5G era

  • We mainly focus on the problem of resource allocation for Internet of Vehicles (IoV) computational tasks of different types in the mobile edge computing (MEC) based system

  • To cope with the resource limitation of a single MEC server, we introduce the concept of multi-MEC-server collaboration to achieve resource sharing
Summary

INTRODUCTION

With the progress of 5G technology, the world has accelerated its pace into the 5G era. To construct such a vehicular system with multiple MEC servers, we face the following challenges. How to allocate appropriate computing resources to different tasks so as to satisfy their stringent delay constraints deserves careful study; traditional methods such as game theory [3] and the genetic annealing algorithm [4] struggle with the problem's complexity. We propose a novel regional intelligent management vehicular system with multi-tier MEC servers, which can provide computation-intensive, mobility-aware, and low-latency services. To minimize delay, we formulate the optimization problem as a Markov decision process and design an algorithm that allocates computational resources adaptively via deep reinforcement learning.
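As a minimal sketch of the deep-reinforcement-learning allocation idea (not the authors' exact algorithm), a deep Q-network could map an observed regional state, such as per-server free capacity and the pending task's type, to a discrete allocation action, such as which MEC server serves the task. The state dimension, action set, and reward design below are illustrative assumptions.

```python
# Illustrative DQN sketch for per-task resource allocation (assumed state/action/reward design).
import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim

STATE_DIM = 8   # e.g., per-server free CPU, queue lengths, task size and delay class (assumed)
N_ACTIONS = 4   # e.g., which MEC server in the region handles the task (assumed)

class QNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, N_ACTIONS),
        )

    def forward(self, x):
        return self.net(x)

q_net = QNet()
optimizer = optim.Adam(q_net.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)   # stores (state, action, reward, next_state) tuples
gamma, epsilon = 0.95, 0.1

def select_action(state: torch.Tensor) -> int:
    """Epsilon-greedy choice among candidate allocations."""
    if random.random() < epsilon:
        return random.randrange(N_ACTIONS)
    with torch.no_grad():
        return int(q_net(state.unsqueeze(0)).argmax())

def train_step(batch_size: int = 32) -> None:
    """One gradient step on the temporal-difference loss over a sampled batch."""
    if len(replay) < batch_size:
        return
    batch = random.sample(replay, batch_size)
    states = torch.stack([b[0] for b in batch])
    actions = torch.tensor([b[1] for b in batch])
    rewards = torch.tensor([b[2] for b in batch], dtype=torch.float32)
    next_states = torch.stack([b[3] for b in batch])
    q = q_net(states).gather(1, actions.view(-1, 1)).squeeze(1)
    with torch.no_grad():
        target = rewards + gamma * q_net(next_states).max(1).values
    loss = nn.functional.mse_loss(q, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

In this setup the reward would typically be the negative of the experienced task delay, so that minimizing delay corresponds to maximizing the expected return over the Markov decision process.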

RELATED WORK
COMPUTING MODEL
PROBLEM FORMULATION AND SOLUTION
ACTION SPACE
Q-LEARNING TO DEEP Q-LEARNING
CONCLUSION