Abstract

Computation-intensive mobile applications are proliferating rapidly and overloading the computation capability of smart mobile devices (SMDs). With the assistance of mobile edge computing and mobile cloud computing, SMDs can rent computation resources and offload computation-intensive applications to edge clouds and remote clouds, which reduces both the application completion delay and the energy consumption of SMDs. In this paper, we consider mobile applications modeled as task call graphs and investigate the joint task offloading and resource scheduling problem in hybrid edge-cloud networks. Owing to the interdependency of tasks, time-varying wireless channels, and the stochastic availability of computation resources in hybrid edge-cloud networks, it is challenging to make task offloading decisions and schedule computation frequencies so as to minimize the weighted sum of energy, time, and rent cost (ETRC). To address this challenge, we propose two efficient algorithms for different levels of system information. With full system information, the task offloading and resource scheduling decisions are determined via semidefinite relaxation and dual decomposition. With only partial system information, we propose a deep reinforcement learning framework in which future system information is inferred by long short-term memory (LSTM) networks, and the discrete offloading decisions and continuous computation frequencies are learned by a modified deep deterministic policy gradient (DDPG) algorithm. Extensive simulations evaluate the convergence of ETRC under various system parameters, and the results validate the superiority of the proposed task offloading and resource scheduling algorithms over baseline schemes.
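As a minimal illustrative sketch only (the function name, weight values, and cost figures below are hypothetical placeholders and are not taken from the paper), the weighted ETRC objective described above combines energy consumption, completion time, and resource rent cost into a single scalar:

```python
def weighted_etrc(energy, time, rent, w_e, w_t, w_r):
    """Weighted sum of energy (J), completion time (s), and rent cost.

    The weights w_e, w_t, w_r trade off the three objectives; their
    values here are illustrative, not from the paper.
    """
    return w_e * energy + w_t * time + w_r * rent


# Hypothetical example: 2.0 J of energy, 1.5 s completion delay,
# and 0.8 units of edge/cloud rent cost.
cost = weighted_etrc(energy=2.0, time=1.5, rent=0.8, w_e=0.5, w_t=0.3, w_r=0.2)
print(round(cost, 3))  # -> 1.61
```

An offloading and scheduling policy would then be evaluated by the ETRC it yields over all tasks in the call graph, summed across the chosen execution locations.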
