Abstract

With the development of wireless network technologies such as LTE/5G, Mobile Cloud Computing (MCC) has been proposed as a solution for mobile devices that need to carry out high-complexity computation with limited resources. With MCC, high-complexity computation tasks are offloaded from mobile devices to cloud servers. However, MCC does not work well for time-sensitive mobile applications because of the relatively long latency between mobile devices and cloud servers. Mobile Edge Computing (MEC) is expected to overcome this limitation of MCC. With MEC, edge servers, rather than cloud servers, are deployed at the edge of the network to provide offloading services to mobile devices. Since edge servers are much closer to mobile devices, the resulting latency is significantly lower. Despite these advantages of MEC over MCC, edge servers are not as resource-abundant as cloud servers. Consequently, when many offloaded tasks arrive at an edge server, admission control is needed to achieve the best performance. In this paper, we propose a Deep Reinforcement Learning (DRL) based admission control scheme, DAC, to maximize the system throughput of an edge server. Our experimental results indicate that DAC outperforms existing admission control schemes for MEC in terms of system throughput.
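To illustrate the idea of learning-based admission control described above, the following is a minimal sketch, not the paper's actual DAC scheme: it substitutes tabular Q-learning for deep RL, and the queue capacity, reward values, and transition model are all hypothetical simplifications. The state is the edge server's queue length, the actions are admit or reject, and the reward encourages admitting tasks while penalizing admissions that would overload the server.

```python
import random

# Hypothetical toy model of an edge server's admission decision.
CAPACITY = 5          # assumed maximum queue length (not from the paper)
ADMIT, REJECT = 1, 0

def reward(state, action):
    # Admitting a task when the queue has room contributes to throughput (+1);
    # admitting when the queue is full overloads the server (-1);
    # rejecting forgoes the task (0). These values are illustrative only.
    if action == ADMIT:
        return 1.0 if state < CAPACITY else -1.0
    return 0.0

def train(episodes=2000, alpha=0.1, gamma=0.9, eps=0.1, seed=0):
    """Tabular Q-learning stand-in for the DRL agent."""
    rng = random.Random(seed)
    q = {(s, a): 0.0 for s in range(CAPACITY + 1) for a in (ADMIT, REJECT)}
    for _ in range(episodes):
        s = rng.randint(0, CAPACITY)
        for _ in range(20):
            # Epsilon-greedy action selection.
            if rng.random() < eps:
                a = rng.choice((ADMIT, REJECT))
            else:
                a = max((ADMIT, REJECT), key=lambda x: q[(s, x)])
            r = reward(s, a)
            # Simplified dynamics: queue grows on admit, drains otherwise.
            s2 = min(CAPACITY, s + 1) if a == ADMIT else max(0, s - 1)
            best_next = max(q[(s2, ADMIT)], q[(s2, REJECT)])
            q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
            s = s2
    return q

if __name__ == "__main__":
    q = train()
    policy = {s: max((ADMIT, REJECT), key=lambda a: q[(s, a)])
              for s in range(CAPACITY + 1)}
    print(policy)  # 1 = admit, 0 = reject, keyed by queue length
```

Under these assumptions, the learned policy admits tasks while the queue has room and rejects them when the server is saturated, which is the qualitative behavior an admission controller aims for; the actual DAC scheme replaces the Q-table with a deep network and a realistic MEC system model.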
