Abstract

Mobile edge computing (MEC) has recently emerged as a promising technology for integrating sensing, transmission, and computation in the industrial Internet of Things (IIoT). This paper investigates an MEC-enabled IIoT system in which multiple industrial devices may offload computation-intensive tasks to an edge server over wireless links. We focus on the online offloading problem of optimizing the tradeoff between task completion time and energy consumption. Time-varying wireless channels, random task data sizes, dynamically changing residual energy, and adaptively adjusted tradeoff weights make this problem highly challenging, and conventional optimization methods may yield inefficient or even infeasible solutions. To tackle this problem efficiently, we leverage deep reinforcement learning (DRL) to propose a time-energy tradeoff online offloading algorithm called TETO. In TETO, online offloading decision policies are learned empirically via a well-designed DRL framework. The TETO algorithm incorporates a stochastic strategy, crossover and mutation operations, and a novel feasible suboptimal offloading method to expand the offloading action search space with a provable feasibility guarantee. Extensive experimental results based on a real-world dataset show that TETO outperforms existing baseline algorithms and attains near-optimal performance with low CPU execution latency.
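The abstract's idea of expanding the offloading action search space via crossover and mutation, then scoring candidates by a weighted time-energy cost, can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual TETO implementation: the function names, parameters (e.g. `mutate_p`, `n_candidates`), and the single-point-crossover choice are all assumptions made here for clarity.

```python
import random

def expand_actions(base_a, base_b, n_candidates=8, mutate_p=0.1, seed=0):
    """Expand the offloading-action search space via crossover and mutation.

    base_a, base_b: two binary offloading vectors (1 = offload to edge,
    0 = compute locally), e.g. quantized from a policy network's output.
    Hypothetical helper: names and parameters are illustrative only.
    """
    rng = random.Random(seed)
    n = len(base_a)
    candidates = {tuple(base_a), tuple(base_b)}
    for _ in range(200):  # bounded attempts to avoid a non-terminating loop
        if len(candidates) >= n_candidates:
            break
        cut = rng.randrange(1, n)                # single-point crossover
        child = base_a[:cut] + base_b[cut:]
        # flip each bit independently with probability mutate_p (mutation)
        child = [bit ^ (rng.random() < mutate_p) for bit in child]
        candidates.add(tuple(child))
    return [list(c) for c in candidates]

def weighted_cost(action, task_time, task_energy, w):
    """Weighted time-energy tradeoff cost of a candidate offloading action.

    task_time[i][a], task_energy[i][a]: time / energy of device i under
    choice a (0 = local, 1 = offload); w in [0, 1] is the tradeoff weight.
    """
    t = sum(task_time[i][a] for i, a in enumerate(action))
    e = sum(task_energy[i][a] for i, a in enumerate(action))
    return w * t + (1 - w) * e
```

A usage pattern would be to generate candidates from the policy's output, evaluate `weighted_cost` on each, and keep only candidates that satisfy the system's feasibility constraints (e.g. residual-energy budgets) before selecting the minimum-cost action.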
