Abstract

Even though small portable devices are becoming increasingly powerful in terms of processing capability and energy efficiency, some workloads still require more computational capacity than these devices offer. Examples of such workloads are real-time sensory input processing, video game streaming, and workloads relating to IoT devices. Some of these workloads, such as virtual reality, also require very low latency; hence, they cannot be offloaded to a distant cloud service. To tackle this issue, edge devices, which are closer to the user, are used instead of cloud servers. In this study, we explore the problem of assigning tasks from mobile devices to edge devices so as to minimize task response latency and the power consumption of mobile devices, which have limited battery capacity. A deep Q-learning model handles the task-offloading decision process on mobile and edge devices. This study makes two main contributions. First, because training a deep Q-learning model is a computational burden for a mobile device, we propose moving that training to the connected edge devices. Second, we propose a routing protocol that delivers task results to a mobile device after it connects to a new edge device and is therefore no longer connected to the edge device to which earlier tasks were offloaded.
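The abstract does not give the model's architecture, state space, or reward, so the following is only a minimal sketch of the general idea: a small Q-network that maps a task/device state to an execute-locally vs. offload decision, trained to minimize a combined latency-plus-energy cost. All names, dimensions, and the toy cost model below are illustrative assumptions, not details from the paper.

```python
import numpy as np

# Illustrative setup (not from the paper):
#   state  = [task_size, battery_level, link_quality], each normalized to [0, 1]
#   action = 0 (execute locally) or 1 (offload to the connected edge device)
#   reward = -(latency + ENERGY_WEIGHT * mobile_energy), so the agent jointly
#            minimizes response latency and mobile power consumption.
rng = np.random.default_rng(0)

STATE_DIM, N_ACTIONS, HIDDEN = 3, 2, 16
GAMMA, LR, ENERGY_WEIGHT = 0.9, 1e-2, 0.5

# Tiny two-layer Q-network; weights are updated by plain SGD on the TD error.
W1 = rng.normal(0.0, 0.1, (STATE_DIM, HIDDEN))
W2 = rng.normal(0.0, 0.1, (HIDDEN, N_ACTIONS))

def q_values(s):
    h = np.maximum(0.0, s @ W1)          # ReLU hidden layer
    return h @ W2, h

def simulate(s, a):
    """Toy environment: offloading is fast on a good link but costs radio
    energy; local execution costs more CPU energy for large tasks."""
    task, battery, link = s
    if a == 1:                            # offload to edge
        latency = 0.2 + task * (1.2 - link)
        energy = 0.1 * task
    else:                                 # run locally
        latency = task * 1.0
        energy = 0.6 * task
    reward = -(latency + ENERGY_WEIGHT * energy)
    next_s = rng.uniform(0.0, 1.0, STATE_DIM)   # next task arrives
    return reward, next_s

def train(steps=2000, eps=0.2):
    global W1, W2
    s = rng.uniform(0.0, 1.0, STATE_DIM)
    for _ in range(steps):
        q, h = q_values(s)
        a = int(rng.integers(N_ACTIONS)) if rng.random() < eps else int(np.argmax(q))
        r, s2 = simulate(s, a)
        target = r + GAMMA * np.max(q_values(s2)[0])
        td = q[a] - target
        # Backpropagate the TD error through both layers.
        grad_out = np.zeros(N_ACTIONS)
        grad_out[a] = td
        W2 -= LR * np.outer(h, grad_out)
        grad_h = (W2 @ grad_out) * (h > 0)
        W1 -= LR * np.outer(s, grad_h)
        s = s2

train()
# Query the learned policy for a large task on a good link.
state = np.array([0.9, 0.5, 0.9])
decision = int(np.argmax(q_values(state)[0]))
print("offload" if decision == 1 else "local")
```

In the paper's scheme this training loop would run on the connected edge device rather than on the mobile device itself, with the mobile device only evaluating the resulting policy; the second contribution (result routing across edge handovers) is orthogonal to this sketch.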
