Abstract

Mobile-edge computing (MEC) offers computing resources to mobile devices for computationally intensive tasks by deploying servers at the edge of the wireless mobile network. This mitigates the scarcity of computing resources on mobile devices and enhances the intelligence of the Internet of Things (IoT), a crucial technology for achieving industrial digitalization. Considering the time-varying channel as well as the time-varying computing resources available at MEC servers, this article formulates a hybrid optimization problem that jointly addresses task offloading and resource allocation, with the goal of minimizing the overall power consumption of the MEC servers. Since the channel state information (CSI) stored in the MEC system is not real time, we propose a reinforcement learning (RL) algorithm that predicts the current CSI from historical CSI and obtains the optimal task-offloading strategy; convex optimization methods are then used to derive the dynamic resource-allocation strategy. In addition, an approach based on deep RL (DRL) is put forward to overcome the curse of dimensionality in RL algorithms. Simulation experiments illustrate that the proposed algorithms outperform nonpredictive schemes by a large margin, and their performance is close to that of the optimal scheme, which uses real-time CSI.
