Abstract

Compute-intensive applications are increasingly common on mobile devices, yet the devices themselves often cannot meet their computing requirements. Mobile Edge Computing (MEC) has been proposed to address this problem: mobile devices offload computing tasks to nearby edge servers, which accept and execute them. We argue that the edge servers should complete as many tasks as possible while minimizing the total energy cost. However, mobile users are typically on the move, so the energy consumption of each task changes constantly, and an efficient algorithm is needed to dynamically decide how to offload tasks to a set of edge servers so as to maximize the number of completed tasks while minimizing energy consumption. Because this dynamic offloading decision is a multi-stage decision problem, we model it as a Markov decision process (MDP). Since the problem involves a huge, high-dimensional state space, we propose a deep reinforcement learning (DRL) based mobile device task offloading algorithm to solve it. We also run extensive experiments to evaluate our task offloading approach against other typical task offloading approaches. The experimental results show that our algorithm outperforms the baseline algorithms.
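To make the MDP/DRL formulation concrete, below is a minimal sketch in Python (PyTorch) of the kind of DQN-style offloading agent the abstract describes: the state summarizes the current task and the edge servers, the discrete action picks an execution target, and the reward trades off task completion against energy cost. All names and dimensions here (STATE_DIM, NUM_SERVERS, the reward shape) are illustrative assumptions, not the paper's exact formulation.

```python
# Sketch of a DQN agent for dynamic task offloading (assumed formulation).
# State (assumed): task features + per-server load/distance as one vector.
# Action: offload to one of NUM_SERVERS edge servers, or run locally.
# Reward (assumed): +1 if the task completes in time, minus a weighted
# energy-cost term, matching the "maximize completions, minimize energy"
# objective in the abstract.
import random
from collections import deque

import torch
import torch.nn as nn
import torch.optim as optim

STATE_DIM = 8                   # assumed state vector size
NUM_SERVERS = 3                 # assumed number of edge servers
NUM_ACTIONS = NUM_SERVERS + 1   # one action per server + local execution

class QNetwork(nn.Module):
    """Maps a state vector to a Q-value per offloading action."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(STATE_DIM, 64), nn.ReLU(),
            nn.Linear(64, 64), nn.ReLU(),
            nn.Linear(64, NUM_ACTIONS),
        )

    def forward(self, x):
        return self.net(x)

q_net = QNetwork()
target_net = QNetwork()                      # periodically synced copy
target_net.load_state_dict(q_net.state_dict())
optimizer = optim.Adam(q_net.parameters(), lr=1e-3)
replay = deque(maxlen=10_000)                # (s, a, r, s_next, done) tuples
gamma, epsilon = 0.99, 0.1

def select_action(state):
    """Epsilon-greedy offloading decision for the current task."""
    if random.random() < epsilon:
        return random.randrange(NUM_ACTIONS)
    with torch.no_grad():
        q = q_net(torch.as_tensor(state, dtype=torch.float32))
        return int(q.argmax().item())

def train_step(batch_size=32):
    """One gradient step on a sampled minibatch of transitions."""
    if len(replay) < batch_size:
        return
    batch = random.sample(replay, batch_size)
    s, a, r, s_next, done = map(torch.as_tensor, zip(*batch))
    s, s_next = s.float(), s_next.float()
    # Q-value of the action actually taken in each transition.
    q = q_net(s).gather(1, a.long().unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        # Bootstrapped target; the value of terminal states is zero.
        target = r.float() + gamma * target_net(s_next).max(1).values * (1 - done.float())
    loss = nn.functional.mse_loss(q, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # In a full training loop, target_net.load_state_dict(q_net.state_dict())
    # would be called every few hundred steps to stabilize the targets.
```

The key design point this sketch illustrates is that user mobility only changes the state (e.g., user-server distances and hence energy costs), so the same trained policy can keep making offloading decisions as conditions shift, without re-solving the problem from scratch.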
