Abstract

As transportation becomes more convenient and efficient, users move faster and faster. When a user leaves the service range of the original edge server, that server must migrate the tasks the user has offloaded to other edge servers. An effective task migration strategy needs to fully consider the locations of users, the load status of the edge servers, and energy consumption, which makes designing such a strategy a challenge. In this paper, we propose a mobile edge computing (MEC) system architecture consisting of multiple smart mobile devices (SMDs), multiple unmanned aerial vehicles (UAVs), and a base station (BS). Moreover, we establish a Markov decision process with unknown rewards (MDPUR) model based on the traditional Markov decision process (MDP), which comprehensively considers three aspects: the migration distance, the residual energy status of the UAVs, and the load status of the UAVs. Based on the MDPUR model, we propose an advantage-based value iteration (ABVI) algorithm to obtain an effective task migration strategy, which helps the UAV group achieve load balancing and reduce its total energy consumption while ensuring user service quality. Finally, the results of simulation experiments show that the ABVI algorithm is effective; in particular, it outperforms the traditional value iteration algorithm and remains robust in a dynamic environment.

Highlights

  • Nowadays, with the rapid development of smart mobile devices (SMDs) and online applications, users’ requirements for services are increasing

  • Between 0 and 19 min, relatively few of the tasks offloaded by SMDs need to be migrated, so the total energy consumption of the unmanned aerial vehicle (UAV) group grows at a slower rate

  • Because both the traditional value iteration (TVI) algorithm and the Markov decision process (MDP)-SD algorithm use distance as the priority condition when choosing target migration nodes, when the number of task migrations begins to slowly decrease, some UAVs closer to the SMDs still consume high energy while some UAVs farther from the SMDs have lower energy consumption, which causes the range of the residual energy state to start to fluctuate

Summary

Introduction

With the rapid development of smart mobile devices (SMDs) and online applications, users’ requirements for services are increasing. The research direction of this paper is not limited to one aspect; rather, it comprehensively considers three aspects: the cost of task migration, the load status of the edge servers, and the users’ quality of service. The purpose of this paper is to design an efficient task migration strategy that achieves load balancing and reduces the total energy consumption of the unmanned aerial vehicle (UAV) group under the premise of ensuring user service quality. Based on the Markov decision process (MDP), we establish the Markov decision process with unknown rewards (MDPUR) model, in which the reward function fully considers the load state and residual energy state of the UAVs. In addition, we design the advantage-based value iteration (ABVI) algorithm, which combines the parent-generation crossover (PC) algorithm, to obtain the optimal migration strategy.
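The ABVI algorithm builds on classical value iteration over an MDP. As a point of reference, the sketch below shows the standard value-iteration baseline on a small toy MDP; the states, actions, transition probabilities, and rewards here are purely illustrative, and the paper's advantage-based selection, unknown-reward (MDPUR) formulation, and UAV-specific reward terms are not modeled.

```python
# Classical value iteration on a toy MDP -- a baseline sketch only, not
# the paper's ABVI algorithm. The 3-state MDP below is hypothetical.

GAMMA = 0.9   # discount factor
THETA = 1e-6  # convergence threshold

# transitions[state][action] = list of (probability, next_state, reward)
transitions = {
    0: {"stay": [(1.0, 0, 0.0)], "move": [(0.8, 1, 1.0), (0.2, 0, 0.0)]},
    1: {"stay": [(1.0, 1, 0.5)], "move": [(1.0, 2, 2.0)]},
    2: {"stay": [(1.0, 2, 0.0)]},
}

def value_iteration(transitions, gamma=GAMMA, theta=THETA):
    # Initialize all state values to zero.
    V = {s: 0.0 for s in transitions}
    while True:
        delta = 0.0
        for s, actions in transitions.items():
            # Bellman optimality backup: best expected return over actions.
            q_values = [
                sum(p * (r + gamma * V[s2]) for p, s2, r in outcomes)
                for outcomes in actions.values()
            ]
            best = max(q_values)
            delta = max(delta, abs(best - V[s]))
            V[s] = best
        if delta < theta:
            break
    # Extract the greedy policy with respect to the converged values.
    policy = {
        s: max(
            actions,
            key=lambda a: sum(p * (r + gamma * V[s2]) for p, s2, r in actions[a]),
        )
        for s, actions in transitions.items()
    }
    return V, policy

V, policy = value_iteration(transitions)
print(V, policy)
```

In a migration setting, states would encode user position and UAV load/energy status, actions would be candidate target UAVs, and the reward would combine migration distance with the residual energy and load terms the paper describes.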

Related Work
Communication Model
Delay Model
Energy Consumption Model
Problem Formulation
Mathematical Model of the MDPUR Algorithm
Reward Function
Value Function
Setting of Experimental Parameters
Analysis of the Total Energy Consumption of the UAV Group
Analysis of Task Migration Time
Analysis of the Load Status of the UAV Group
Analysis of the Flight Time of the UAV Group
Conclusions