Abstract

Mobile Edge Computing (MEC) has become a key component of future mobile broadband networks due to its low latency. In MEC, mobile devices can access data-intensive applications deployed at the edge, supported by the service and computing resources available on edge servers. However, deploying such applications is challenging because data transmission, user mobility and load-balancing conditions change constantly among mobile devices, edge servers and the cloud. In this paper, we propose an approach for formulating a Data-intensive Application Edge Deployment Policy (DAEDP) that maximizes the latency reduction for mobile devices while minimizing the monetary cost for Application Service Providers (ASPs). The deployment problem is modelled as a Markov decision process, and a deep reinforcement learning strategy is proposed to formulate the optimal policy by maximizing the long-term discounted reward. Extensive experiments are conducted to evaluate DAEDP. The results show that DAEDP outperforms four baseline approaches.
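To illustrate the kind of formulation the abstract describes, the following is a minimal, hypothetical sketch of casting a deployment decision as a Markov decision process and learning a policy that maximizes the long-term discounted reward. The state space (edge-server load levels), action space (deploy on edge vs. keep in cloud), and reward shape (latency gain minus monetary cost) are toy assumptions for illustration only, not the paper's actual model, and tabular Q-learning stands in for the deep reinforcement learning strategy:

```python
import random

random.seed(0)

# Hypothetical toy MDP: states are coarse edge-server load levels,
# actions choose where to deploy, and the reward trades off latency
# reduction against deployment cost, echoing the DAEDP objective.
STATES = range(3)          # 0 = low, 1 = medium, 2 = high edge load
ACTIONS = range(2)         # 0 = deploy on edge server, 1 = keep in cloud
GAMMA, ALPHA, EPS = 0.9, 0.1, 0.1

def step(state, action):
    """Illustrative dynamics: edge deployment cuts latency but costs
    more, and its benefit shrinks as the edge server's load grows."""
    latency_gain = (2 - state) if action == 0 else 0
    cost = 1 if action == 0 else 0
    reward = latency_gain - cost
    next_state = random.choice(list(STATES))  # load fluctuates randomly
    return next_state, reward

Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}
state = 0
for _ in range(20000):
    # epsilon-greedy action selection
    if random.random() < EPS:
        action = random.choice(list(ACTIONS))
    else:
        action = max(ACTIONS, key=lambda a: Q[(state, a)])
    nxt, r = step(state, action)
    # Q-learning update toward the long-term discounted reward
    Q[(state, action)] += ALPHA * (
        r + GAMMA * max(Q[(nxt, a)] for a in ACTIONS) - Q[(state, action)])
    state = nxt

# Greedy deployment policy extracted from the learned Q-values
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in STATES}
print(policy)
```

Under these toy dynamics the learned policy deploys to the edge when load is low (the latency gain outweighs the cost) and keeps the application in the cloud when load is high; the paper's deep reinforcement learning strategy replaces the Q-table with a neural network to handle the far larger real state space.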
