Abstract

Cloud radio access network (C-RAN) is a promising architecture for meeting the ever-increasing resource demand in telecommunication networks. In C-RAN, a base station is decoupled into a baseband unit (BBU) and a remote radio head (RRH). The BBUs are centralized and virtualized as virtual machines (VMs) inside a BBU pool, allowing the architecture to absorb the massive growth in cellular data traffic. However, resource management in C-RAN must be designed carefully to save energy while meeting user demand over a long operational period. Since user demand is highly dynamic across time and location, managing resources optimally is challenging. In this paper, we exploit a deep reinforcement learning (DRL) model to learn the spatial and temporal user demand in C-RAN, and propose an algorithm that resizes the VMs to allocate computational resources inside the BBU pool. Computational resources are allocated according to the amount required by the RRHs associated with each VM. Through an extensive evaluation study, we show that the proposed algorithm makes the C-RAN resource-efficient while satisfying dynamic user demand.
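The abstract does not specify the DRL architecture, so the following is only an illustrative sketch of the underlying idea: an agent observes a (discretized) demand level at an RRH and learns how much compute to allocate to the associated VM, trading off energy cost against unmet demand. All names, the reward shape, and the use of tabular Q-learning (a simple stand-in for the paper's deep RL model) are assumptions for illustration, not the authors' method.

```python
import random

def train_allocator(num_levels=4, energy_cost=0.2, sla_penalty=1.0,
                    lr=0.1, epsilon=0.3, iters=5000, seed=0):
    """Toy tabular Q-learning stand-in for the paper's DRL resizer.

    State  = discretized RRH demand level (0..num_levels-1).
    Action = compute units allocated to the VM (0..num_levels-1).
    Reward = -(energy cost of allocation + penalty for unmet demand).
    All parameters here are illustrative assumptions.
    """
    rng = random.Random(seed)
    Q = [[0.0] * num_levels for _ in range(num_levels)]
    for _ in range(iters):
        d = rng.randrange(num_levels)            # observed RRH demand level
        if rng.random() < epsilon:               # epsilon-greedy exploration
            a = rng.randrange(num_levels)
        else:
            a = max(range(num_levels), key=lambda x: Q[d][x])
        # Pay for every allocated unit; pay a larger penalty per unserved unit.
        r = -(energy_cost * a + sla_penalty * max(0, d - a))
        Q[d][a] += lr * (r - Q[d][a])            # one-step (bandit-style) update
    return Q

def allocate(Q, demand):
    """Greedy allocation for a given demand level after training."""
    return max(range(len(Q[demand])), key=lambda a: Q[demand][a])
```

Because over-allocating only wastes energy while under-allocating incurs the larger SLA penalty, the learned policy matches the allocation to the demand level; the paper's deep model plays the same role over the real, continuous spatio-temporal demand.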
