Abstract

The fifth generation (5G) technology is expected to support rapid growth in infrastructure and mobile user subscriptions, with an increasing number of remote radio heads (RRHs) per unit area deployed through cloud radio access networks (C-RANs). From an economic point of view, minimizing the energy consumption of the RRHs is a challenging issue. From an environmental point of view, achieving "greenness" in wireless networks is one of the many goals of telecommunication operators. This paper proposes a framework to balance the energy consumption of RRHs and the quality of service (QoS) satisfaction of users in cellular networks using a convolutional neural network (CNN)-based relational dueling deep Q-network (DQN) scheme. First, we formulate the cell activation/deactivation problem as a Markov decision process (MDP) and set up a two-layer CNN that takes raw images captured from the environment as its input. Then, we develop a dueling DQN-based autonomous cell activation scheme that dynamically turns RRHs on or off based on the energy consumption and the QoS requirements of users in the network. Finally, we decouple a customized physical resource allocation for rate-constrained and delay-constrained users from the cell activation scheme and formulate it as a convex optimization problem, ensuring that the QoS requirements of users are met with the minimum number of active RRHs under varying traffic conditions. Extensive simulations reveal that the proposed algorithm converges faster than the Nature DQN, Q-learning, and dueling DQN schemes. Our algorithm also remains stable in mobility scenarios compared with DQN and dueling DQN without a CNN. We also observe a slight improvement in the balance between energy consumption and QoS satisfaction compared with the DQN and dueling DQN schemes.
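To illustrate the architecture described above, the sketch below shows how a two-layer CNN can feed the separate value and advantage streams of a dueling DQN. It is a minimal sketch in PyTorch under stated assumptions: the input image size, channel counts, hidden widths, and number of cell activation actions are illustrative choices, not details taken from the paper.

    # Minimal sketch of a two-layer CNN dueling DQN (hypothetical shapes and
    # hyperparameters; the paper's exact architecture is not given in the abstract).
    import torch
    import torch.nn as nn

    class DuelingDQN(nn.Module):
        def __init__(self, in_channels=3, grid_size=84, n_actions=16):
            super().__init__()
            # Two convolutional layers over the raw captured image of the environment.
            self.conv = nn.Sequential(
                nn.Conv2d(in_channels, 16, kernel_size=8, stride=4), nn.ReLU(),
                nn.Conv2d(16, 32, kernel_size=4, stride=2), nn.ReLU(),
            )
            conv_out = self._conv_out_dim(in_channels, grid_size)
            # Dueling streams: state value V(s) and per-action advantage A(s, a).
            self.value = nn.Sequential(nn.Linear(conv_out, 128), nn.ReLU(), nn.Linear(128, 1))
            self.advantage = nn.Sequential(nn.Linear(conv_out, 128), nn.ReLU(), nn.Linear(128, n_actions))

        def _conv_out_dim(self, in_channels, grid_size):
            # Infer the flattened feature size by passing a dummy image through the CNN.
            with torch.no_grad():
                dummy = torch.zeros(1, in_channels, grid_size, grid_size)
                return self.conv(dummy).flatten(1).shape[1]

        def forward(self, x):
            feat = self.conv(x).flatten(1)
            v = self.value(feat)          # V(s)
            a = self.advantage(feat)      # A(s, a)
            # Dueling aggregation: Q(s, a) = V(s) + A(s, a) - mean_a A(s, a)
            return v + a - a.mean(dim=1, keepdim=True)

    # Example: Q-values for one raw 84x84 RGB snapshot of the network state;
    # each of the 16 outputs would score one hypothetical RRH on/off action.
    q = DuelingDQN()(torch.zeros(1, 3, 84, 84))
    print(q.shape)  # torch.Size([1, 16])

The mean-subtracted aggregation is the standard dueling-network identifiability fix; it lets the value stream learn how good a traffic state is independently of which RRH activation action is chosen.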
