Abstract
Mobile edge computing pushes computing and storage capabilities to the network edge to provide reliable, low-latency services. However, user mobility and the limited coverage of edge servers can interrupt services and degrade service quality. To address the high power consumption and latency of computation-intensive applications, we propose a cooperative edge caching strategy based on an energy-latency trade-off. In the cache selection phase, a request prediction method based on a deep neural network improves the cache hit rate. In the cache placement phase, an objective function is established that jointly considers power consumption and latency, and the branch-and-bound algorithm is used to obtain the optimal solution. We further propose an improved service migration method to address service interruptions caused by user movement. The service migration problem is modeled as a Markov decision process (MDP), with the goal of reducing service latency and improving user experience under given cost and computing-resource constraints. Finally, the model is solved with the deep Q-network (DQN) algorithm. Experiments show that, under the same conditions, the proposed edge caching algorithm achieves lower latency and energy consumption than competing algorithms, and the proposed service migration algorithm outperforms existing methods in migration cost and success rate.
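To make the DQN-based migration step concrete, the following is a minimal sketch of the Bellman update at the core of any DQN-style solver. The states, actions, rewards, and the linear Q-function here are hypothetical placeholders, not the paper's actual state or network design; it only illustrates the update rule (learn toward the target r + γ·max Q_target) that such a solver repeats.

```python
import numpy as np

# Hypothetical toy MDP: n_states service placements, n_actions migration choices.
rng = np.random.default_rng(0)
n_states, n_actions, gamma = 4, 2, 0.9

# Linear Q-function Q(s, a) = W[a] @ one_hot(s); a frozen target copy
# stabilizes learning, as in standard DQN.
W = rng.normal(scale=0.1, size=(n_actions, n_states))
W_target = W.copy()

def one_hot(s):
    v = np.zeros(n_states)
    v[s] = 1.0
    return v

def q_values(weights, s):
    return weights @ one_hot(s)

def dqn_step(s, a, r, s_next, lr=0.1):
    """One gradient step toward the Bellman target r + gamma * max_a' Q_target(s', a')."""
    target = r + gamma * q_values(W_target, s_next).max()
    td_error = target - q_values(W, s)[a]
    # Gradient of 0.5 * td_error**2 w.r.t. W[a] for this linear Q-function.
    W[a] += lr * td_error * one_hot(s)
    return td_error

# Example transition: taking migration action 1 in state 0 yields reward 1
# and lands in state 1; repeating the update drives Q(0, 1) toward the target.
for _ in range(200):
    dqn_step(s=0, a=1, r=1.0, s_next=1)
W_target = W.copy()  # periodic target-network synchronization
```

In a full implementation, the linear Q-function would be a neural network, transitions would be drawn from a replay buffer, and actions would be chosen ε-greedily; the update rule itself is unchanged.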