The proliferation of 5G networks has revolutionized wireless communication by delivering enhanced speeds, ultra-low latency, and widespread connectivity. However, in heterogeneous cloud radio access networks (H-CRANs), efficiently managing inter-cell interference while ensuring energy conservation remains a critical challenge. This paper presents a novel energy-efficient, dynamic enhanced inter-cell interference coordination (eICIC) scheme based on deep reinforcement learning (DRL). Unlike conventional approaches that focus primarily on optimizing parameters such as almost blank subframe (ABS) ratios and bias offsets (BOs), our work introduces the transmission power during ABS subframes (TPA) and the channel quality indicator (CQI) threshold of victim user equipment (CTV) into the optimization process. Additionally, the proposed scheme uniquely incorporates energy consumption into the optimization objective, addressing both performance and sustainability concerns. By modeling key factors such as signal-to-interference-plus-noise ratio (SINR) and service rates, we introduce the concept of energy-utility efficiency to balance energy savings with quality of service (QoS). Simulation results demonstrate that the proposed scheme achieves up to 70% energy savings while enhancing QoS satisfaction, showcasing its potential to significantly improve the efficiency and sustainability of future 5G H-CRAN deployments.
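As a rough illustration of the optimization loop the abstract describes, the sketch below frames the four eICIC parameters (ABS ratio, BO, TPA, CTV) as a control action and scores it with a hypothetical energy-utility efficiency reward. The action fields, reward definition, toy simulator, and random-search placeholder for the DRL agent are illustrative assumptions, not the paper's actual formulation.

```python
import random
from dataclasses import dataclass

# Hypothetical eICIC control action covering the four parameters named in the abstract.
@dataclass
class EicicAction:
    abs_ratio: float       # fraction of almost blank subframes (0..1)
    bias_offset_db: float  # cell range expansion bias offset (dB)
    tpa_dbm: float         # transmission power during ABS subframes (dBm)
    ctv: int               # CQI threshold classifying victim UEs (1..15)

def energy_utility_efficiency(sum_rate_mbps: float, energy_joule: float) -> float:
    """Illustrative energy-utility efficiency: served rate per unit of energy consumed."""
    return sum_rate_mbps / max(energy_joule, 1e-9)

def simulate_step(action: EicicAction) -> tuple[float, float]:
    """Stand-in for the H-CRAN simulator: returns (sum rate, energy used).
    A real evaluation would come from SINR and service-rate models, not this toy."""
    rate = 100.0 * (1.0 - 0.3 * action.abs_ratio) * (1.0 + 0.01 * action.bias_offset_db)
    energy = 50.0 + 0.8 * action.tpa_dbm + 20.0 * (1.0 - action.abs_ratio)
    return rate, energy

def random_action() -> EicicAction:
    return EicicAction(
        abs_ratio=random.uniform(0.0, 0.8),
        bias_offset_db=random.uniform(0.0, 12.0),
        tpa_dbm=random.uniform(0.0, 30.0),
        ctv=random.randint(3, 12),
    )

# Greedy random search as a placeholder for the DRL agent: keep the action
# with the best energy-utility efficiency over a handful of trials.
best_action, best_reward = None, float("-inf")
for _ in range(100):
    a = random_action()
    rate, energy = simulate_step(a)
    r = energy_utility_efficiency(rate, energy)
    if r > best_reward:
        best_action, best_reward = a, r

print(best_action, round(best_reward, 3))
```

In the paper's setting a trained DRL policy would replace the random search, selecting the parameter tuple from observed network state rather than by blind sampling.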