Abstract

With the continuous development of mobile edge computing, users increasingly prefer to offload tasks to edge servers, which are closer to them than cloud services and thus provide a better experience. Owing to user mobility and the limited coverage of individual edge servers, guaranteeing service quality and preventing service interruption is a key challenge. This article studies the service migration problem: deciding when, where, and how to migrate an ongoing service from its current edge server to a target edge server. Since edge servers can observe only partial user information, we model service migration as a partially observable Markov decision process (POMDP). To minimize user delay and system energy consumption, we propose a service migration decision algorithm based on Deep Recurrent Q-Networks (DRQNSM). Extensive experiments show that our algorithm outperforms several classical reinforcement learning algorithms.
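The decision problem the abstract describes can be sketched as a small simulation: the agent sees only a coarse observation of the user's position, chooses which edge server should host the service each slot, and pays a cost combining delay and a one-off migration penalty. The sketch below is a minimal stand-in, not the paper's method: it replaces the DRQN's recurrent memory with a tabular Q-learner over a short observation history, and all numbers (server count, costs, mobility model) are illustrative assumptions.

```python
# Hedged sketch of the service-migration POMDP. A tabular Q-learner over
# short observation histories stands in for the DRQN's recurrent state.
# All constants below are assumptions for illustration only.
import random
from collections import defaultdict, deque

random.seed(0)

N_SERVERS = 3          # hypothetical number of edge servers
MIGRATE_COST = 2.0     # assumed one-off migration delay/energy penalty
HISTORY = 2            # observation-history length replacing the RNN memory

def observe(user_pos, serving):
    # Partial observability: the system sees only a coarse user zone,
    # not the exact position.
    return (user_pos // 2, serving)

def step(user_pos, serving, action):
    # action: index of the server that hosts the service next slot.
    migrated = action != serving
    user_pos = (user_pos + random.choice([0, 1])) % (2 * N_SERVERS)
    nearest = user_pos // 2
    delay = abs(nearest - action)              # farther server -> higher delay
    cost = delay + (MIGRATE_COST if migrated else 0.0)
    return user_pos, action, -cost             # reward = -(delay + energy)

Q = defaultdict(float)

def policy(hist, eps=0.1):
    # Epsilon-greedy over the observation-history "belief" key.
    if random.random() < eps:
        return random.randrange(N_SERVERS)
    return max(range(N_SERVERS), key=lambda a: Q[(hist, a)])

def train(episodes=300, alpha=0.2, gamma=0.9):
    for _ in range(episodes):
        user_pos, serving = 0, 0
        hist = deque([observe(user_pos, serving)] * HISTORY, maxlen=HISTORY)
        for _ in range(30):
            h = tuple(hist)
            a = policy(h)
            user_pos, serving, r = step(user_pos, serving, a)
            hist.append(observe(user_pos, serving))
            h2 = tuple(hist)
            best = max(Q[(h2, b)] for b in range(N_SERVERS))
            Q[(h, a)] += alpha * (r + gamma * best - Q[(h, a)])

train()
```

In the full DRQNSM approach, the history window would be replaced by a recurrent network (e.g. an LSTM-based Q-network) that summarizes the partial observations, which scales to continuous observations where a table cannot.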
