Abstract

The massive deployment of Small Base Stations (SBSs) is one of the most promising solutions adopted by 5G cellular networks to meet the anticipated surge in traffic demand. The use of renewable energy to power SBSs has attracted particular attention as a means of reducing the energy footprint of mobile networks, thus mitigating their environmental impact and enabling cost savings for operators. The complexity of the system and the variability of the harvesting process suggest the adoption of learning methods. Here, we investigate techniques based on the Layered Learning paradigm to control dense networks of SBSs powered solely by solar energy. In the first layer, each SBS locally selects its switch ON/OFF policy according to its energy income and traffic demand, using a Heuristically Accelerated Reinforcement Learning method. The second layer relies on an Artificial Neural Network that estimates the network load conditions and implements a centralized controller enforcing the local agent decisions. Simulation results show that the control policy of the proposed framework closely follows the upper bound obtained offline with Dynamic Programming. Moreover, the proposed layered framework outperforms both a greedy and a distributed Reinforcement Learning solution in terms of throughput and energy efficiency under different traffic conditions.
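To make the two-layer architecture described above more concrete, the following is a minimal, hypothetical Python sketch: the local Heuristically Accelerated RL agents are approximated by tabular Q-learning with a simple heuristic bias, and the ANN-based load estimator is reduced to a toy feed-forward function. All names, state variables, and thresholds (battery/traffic discretisation, the heuristic rule, the load cutoff) are assumptions for illustration only and are not taken from the paper.

```python
# Illustrative sketch of the layered control loop (assumptions only;
# the paper's actual state space, heuristic, and ANN are not given here).
import numpy as np

rng = np.random.default_rng(0)

N_SBS = 10            # number of small base stations (assumed)
BATTERY_LEVELS = 5    # discretised battery states (assumed)
TRAFFIC_LEVELS = 5    # discretised local traffic states (assumed)
ACTIONS = 2           # 0 = switch OFF, 1 = stay ON


class HarlAgent:
    """First layer: tabular Q-learning with a heuristic bias,
    a simple stand-in for Heuristically Accelerated RL."""

    def __init__(self, alpha=0.1, gamma=0.9, eps=0.1):
        self.q = np.zeros((BATTERY_LEVELS, TRAFFIC_LEVELS, ACTIONS))
        self.alpha, self.gamma, self.eps = alpha, gamma, eps

    def heuristic(self, battery, traffic):
        # Assumed heuristic: favour OFF when the battery is nearly empty,
        # favour ON when local traffic is high.
        h = np.zeros(ACTIONS)
        if battery == 0:
            h[0] += 1.0
        if traffic >= TRAFFIC_LEVELS - 1:
            h[1] += 1.0
        return h

    def act(self, battery, traffic):
        if rng.random() < self.eps:
            return int(rng.integers(ACTIONS))
        biased = self.q[battery, traffic] + self.heuristic(battery, traffic)
        return int(np.argmax(biased))

    def update(self, state, action, reward, next_state):
        b, t = state
        bn, tn = next_state
        target = reward + self.gamma * self.q[bn, tn].max()
        self.q[b, t, action] += self.alpha * (target - self.q[b, t, action])


def load_estimator(traffic_vector, w, b):
    """Second layer: a toy feed-forward estimate of network load,
    standing in for the ANN-based estimator."""
    hidden = np.tanh(traffic_vector @ w)
    return float(1.0 / (1.0 + np.exp(-(hidden.sum() + b))))


def central_controller(proposed_actions, estimated_load):
    """One possible way to enforce local decisions: override OFF
    choices when the estimated network load is high (assumption)."""
    if estimated_load > 0.8:
        return [1] * len(proposed_actions)   # keep all SBSs ON
    return proposed_actions


# One illustrative control step.
agents = [HarlAgent() for _ in range(N_SBS)]
battery = rng.integers(BATTERY_LEVELS, size=N_SBS)
traffic = rng.integers(TRAFFIC_LEVELS, size=N_SBS)

proposed = [agents[i].act(battery[i], traffic[i]) for i in range(N_SBS)]
w = rng.normal(size=(N_SBS, 4)) * 0.1        # untrained toy weights
est_load = load_estimator(traffic / TRAFFIC_LEVELS, w, b=0.0)
actions = central_controller(proposed, est_load)
print("estimated load:", round(est_load, 3), "actions:", actions)
```

In this sketch the heuristic only biases action selection, so the underlying Q-values still converge through the standard temporal-difference update; the centralized layer acts purely as a supervisor over the locally proposed ON/OFF actions.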
