Abstract
Heterogeneous cloud radio access networks (H-CRANs) are a promising technology for handling the high traffic density expected in 5G communication networks. One of the main challenges in H-CRANs is minimizing energy consumption. In this paper, a deep reinforcement learning method is applied to this problem. We first propose an autonomous cell activation framework with customized physical resource allocation schemes to balance energy consumption and QoS satisfaction in C-RANs. We formulate the cell activation problem as a Markov decision process (MDP). To solve it, we develop a dueling deep Q-network (DQN) based autonomous cell activation framework that satisfies user QoS demands while minimizing energy consumption with the minimum number of active remote radio heads (RRHs) under varying traffic demand. Simulation results illustrate the effectiveness of the proposed solution in minimizing the network's energy consumption.
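To make the dueling-DQN idea concrete, the sketch below shows the dueling head that distinguishes this architecture from a plain DQN: the network splits into a state-value stream V(s) and an advantage stream A(s, a), recombined as Q(s, a) = V(s) + A(s, a) − mean_a A(s, a). This is a minimal numpy illustration only; the state features, layer sizes, and action space here are hypothetical, since the abstract does not specify the paper's actual network.

```python
import numpy as np

rng = np.random.default_rng(0)

STATE_DIM = 8     # assumed size of the traffic-demand state features
NUM_ACTIONS = 4   # assumed number of cell-activation actions (RRH on/off patterns)

# One shared hidden layer, then separate value and advantage streams.
W_h = rng.normal(scale=0.1, size=(STATE_DIM, 16))
W_v = rng.normal(scale=0.1, size=(16, 1))            # value stream V(s)
W_a = rng.normal(scale=0.1, size=(16, NUM_ACTIONS))  # advantage stream A(s, a)

def dueling_q(state):
    """Dueling combination: Q(s,a) = V(s) + A(s,a) - mean_a A(s,a)."""
    h = np.maximum(state @ W_h, 0.0)   # ReLU hidden layer
    v = h @ W_v                        # scalar state value
    a = h @ W_a                        # per-action advantages
    # Subtracting the mean advantage keeps V and A identifiable.
    return v + a - a.mean(axis=-1, keepdims=True)

state = rng.normal(size=(1, STATE_DIM))
q = dueling_q(state)
best_action = int(q.argmax())          # greedy cell-activation decision
```

In the paper's setting, the greedy action would map to an RRH activation decision, and the weights would be trained from the MDP's reward balancing energy consumption against QoS satisfaction.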