Abstract

As one of the key technologies of 5G, the Cloud Radio Access Network (C-RAN), with its pooled cloud BBUs (Base Band Units) and distributed RRHs (Remote Radio Heads), can provide ubiquitous services. When a failure occurs at an RRH, it cannot be alleviated in time and leads to a significant drop in network performance. The cell outage compensation (COC) problem for RRHs in 5G C-RAN is therefore very important. Although deep reinforcement learning (DRL) has been applied to many scenarios related to self-organizing networks (SON), it has seen few applications in cell outage compensation, and most intelligent algorithms struggle to obtain globally optimal solutions. In this paper, targeting the cell outage scenario in C-RAN, with the goal of maximizing the energy efficiency and connectivity of RRHs while meeting the quality-of-service demands of each compensation user, a DRL-based framework is presented. First, compensation users are allocated to adjacent RRHs using the K-means clustering algorithm. Second, a deep Q-network (DQN) is used to find the antenna downtilt and the power allocated to the compensation users. Compared with different genetic algorithms, simulation results show that the proposed framework converges quickly, remains stable, and reaches 95% of the maximum target value. This verifies the efficiency of the DRL-based framework and its effectiveness in meeting user requirements and handling cell outage compensation.
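The first step of the framework, allocating compensation users to adjacent RRHs via K-means, can be illustrated with a minimal sketch. This is an assumption-laden toy, not the paper's exact formulation: the function name `kmeans_assign`, seeding the centroids at the neighboring RRH positions, and treating users and RRHs as 2-D coordinates are all illustrative choices.

```python
import math

def kmeans_assign(users, rrhs, iters=10):
    """Assign compensation users (x, y) to neighboring RRHs via K-means.

    Hypothetical simplification: centroids are seeded at the RRH
    positions, so each resulting cluster maps to one compensating RRH.
    """
    centroids = [list(p) for p in rrhs]
    assign = [0] * len(users)
    for _ in range(iters):
        # Assignment step: each user joins the nearest centroid's cluster.
        for i, (ux, uy) in enumerate(users):
            assign[i] = min(
                range(len(centroids)),
                key=lambda c: math.hypot(ux - centroids[c][0],
                                         uy - centroids[c][1]),
            )
        # Update step: move each centroid to the mean of its members.
        for c in range(len(centroids)):
            members = [users[i] for i in range(len(users)) if assign[i] == c]
            if members:
                centroids[c] = [sum(m[0] for m in members) / len(members),
                                sum(m[1] for m in members) / len(members)]
    return assign

# Toy example: two neighboring RRHs, two compensation users near each.
rrhs = [(0.0, 0.0), (10.0, 0.0)]
users = [(1, 0), (2, 1), (9, 0), (8, -1)]
print(kmeans_assign(users, rrhs))  # -> [0, 0, 1, 1]
```

In a full implementation, the per-cluster user lists would then form the state input to the DQN stage, which searches over discretized downtilt and power levels.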
