Abstract

Edge computing and network slicing are two key technologies for reducing communication latency and improving network flexibility in fog radio access networks (F-RANs). Because the number of potential offloading decisions is massive, in this paper we develop a joint computation offloading and resource allocation strategy that minimizes the total energy consumption of the cloud-edge system. To meet the quality-of-service (QoS) requirements of different devices, two radio access network (RAN) slices are designed. Furthermore, to address the curse of dimensionality caused by the explosive growth in the number of user equipments (UEs), we propose a deep Q-network (DQN) algorithm that uses value function approximation to compress the state dimension. To reduce the complexity of the algorithm, the problem is decomposed into two subproblems: a joint radio resource allocation and fog access point (FAP) selection problem, solved by the DQN, and a cloud-side task forwarding problem, solved by a greedy algorithm. Simulation results demonstrate that the proposed method effectively reduces the total system energy consumption and shortens the convergence time.
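To make the two-subproblem decomposition concrete, the following is a minimal, hypothetical sketch (not the authors' code) of the approach the abstract describes: a DQN value-function approximator and epsilon-greedy policy for the joint radio resource allocation and FAP selection subproblem, and a simple greedy rule for the cloud-side task forwarding subproblem. The state/action encodings, network sizes, and the cost terms are illustrative assumptions, not details from the paper.

```python
# Hypothetical sketch of the DQN + greedy decomposition described above.
# State/action encodings, layer widths, and cost terms are assumptions.
import random
import torch
import torch.nn as nn

class QNetwork(nn.Module):
    """Value-function approximator: maps a UE state vector to Q-values,
    one per (FAP, resource-block) action, compressing the state space."""
    def __init__(self, state_dim: int, num_actions: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim, 128), nn.ReLU(),
            nn.Linear(128, 128), nn.ReLU(),
            nn.Linear(128, num_actions),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        return self.net(state)

def select_action(q_net: QNetwork, state: torch.Tensor,
                  num_actions: int, epsilon: float) -> int:
    """Epsilon-greedy choice of a (FAP, resource-block) pair for one UE."""
    if random.random() < epsilon:
        return random.randrange(num_actions)
    with torch.no_grad():
        return int(q_net(state).argmax().item())

def dqn_update(q_net, target_net, optimizer, batch, gamma=0.99):
    """One temporal-difference update on a replay batch of
    (states, actions, rewards, next_states); reward would be the
    negative energy consumption of the chosen allocation."""
    states, actions, rewards, next_states = batch
    q = q_net(states).gather(1, actions.unsqueeze(1)).squeeze(1)
    with torch.no_grad():
        target = rewards + gamma * target_net(next_states).max(dim=1).values
    loss = nn.functional.mse_loss(q, target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

def greedy_cloud_forwarding(task_loads: dict, fap_capacity: float,
                            cloud_cost_per_bit: float) -> list:
    """Greedy rule for the cloud-side subproblem: keep tasks at the FAP
    while capacity remains, forwarding oversized tasks to the cloud."""
    forwarded, remaining = [], fap_capacity
    for task_id, load in sorted(task_loads.items(), key=lambda kv: -kv[1]):
        if load > remaining:
            forwarded.append((task_id, load * cloud_cost_per_bit))
        else:
            remaining -= load
    return forwarded
```

In this sketch, the DQN handles the exponentially large edge-side action space through value function approximation, while the cloud-side forwarding decision is resolved by a low-complexity greedy pass, mirroring the split described in the abstract.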
