Abstract

The deep integration of wireless power transfer and mobile edge computing (MEC) has made wireless-powered MEC a new research hotspot in the Internet of Things. In this paper, we focus on the joint optimization of online offloading decisions and charging resource allocation to minimize task completion time under dynamic, time-varying wireless channels. Finding the optimal solution requires solving a mixed-integer programming problem in real time, which is proved to be NP-hard and poses nontrivial challenges for conventional optimization methods. To address this problem efficiently, we leverage deep reinforcement learning (DRL) to propose an energy-aware online offloading algorithm called EAOO. EAOO learns online offloading decision policies empirically through a well-designed DRL framework and performs charging resource allocation via feasible-solution-region analysis. We further propose a novel feasible decision vector generation method, incorporating crossover and mutation operations to expand the offloading-vector search space with a provable feasibility guarantee. Extensive experimental results show that EAOO outperforms existing baseline algorithms and achieves near-optimal performance with low CPU execution latency, satisfying practical real-time and efficiency requirements.
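The crossover-and-mutation expansion of the offloading-vector search space mentioned above can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's actual method: the function names, the mutation rate, and especially the `is_feasible` check are assumptions standing in for the paper's feasible-solution-region analysis.

```python
import random

def crossover(parent_a, parent_b):
    """One-point crossover of two binary offloading vectors (illustrative)."""
    point = random.randint(1, len(parent_a) - 1)
    return parent_a[:point] + parent_b[point:]

def mutate(vector, rate=0.1):
    """Flip each offload bit (0 = local compute, 1 = offload) with probability `rate`."""
    return [1 - bit if random.random() < rate else bit for bit in vector]

def is_feasible(vector):
    """Placeholder feasibility check. The paper derives feasibility from its
    feasible-solution-region analysis; a real check would enforce energy and
    latency constraints rather than this toy condition."""
    return sum(vector) >= 1  # e.g., require at least one offloaded task

def expand_candidates(seed_vectors, num_offspring=10):
    """Generate extra candidate offloading vectors via crossover + mutation,
    keeping only offspring that pass the feasibility check."""
    candidates = list(seed_vectors)
    while len(candidates) < len(seed_vectors) + num_offspring:
        a, b = random.sample(seed_vectors, 2)
        child = mutate(crossover(a, b))
        if is_feasible(child):
            candidates.append(child)
    return candidates
```

The key design point this sketch captures is that every generated candidate is filtered through a feasibility check before it enters the search space, which is how a feasibility guarantee can be maintained while the candidate pool grows.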
