Abstract

To meet the growing demand for network capacity, mobile network operators (MNOs) are deploying dense infrastructures of small cells. This, in turn, increases the power consumption of mobile networks and their environmental impact. As a result, there is a recent trend of powering mobile networks with harvested ambient energy to achieve both environmental and cost benefits. In this paper, we consider a network of virtualized small cells (vSCs) powered by energy harvesters and equipped with rechargeable batteries, which can opportunistically offload baseband (BB) functions to a grid-connected edge server depending on their energy availability. We formulate the corresponding grid energy and traffic drop rate minimization problem and propose a distributed deep reinforcement learning (DDRL) solution. Coordination among vSCs is enabled via the exchange of battery state information. The evaluation of network performance in terms of grid energy consumption and traffic drop rate confirms that enabling coordination among the vSCs via knowledge exchange achieves performance close to the optimum. Numerical results also confirm that the proposed DDRL solution provides higher network performance, better adaptation to a changing environment, and higher cost savings than a tabular multi-agent reinforcement learning (MRL) solution used as a benchmark.
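
A compact way to read this joint objective is the weighted minimization sketched below; the notation (decision a_i(t), grid energy E_i^grid(t), dropped traffic D_i(t), weight ω, horizon T, and N vSCs) is illustrative and not necessarily the paper's exact formulation:

\min_{\{a_i(t)\}} \; \sum_{t=1}^{T} \sum_{i=1}^{N} \Big( \omega \, E_i^{\mathrm{grid}}(t) + (1-\omega) \, D_i(t) \Big)

where each vSC $i$ selects a functional split option $a_i(t)$ in time slot $t$, $E_i^{\mathrm{grid}}(t)$ is the grid energy it draws, $D_i(t)$ is its dropped traffic, and $\omega \in [0,1]$ trades off the two terms.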

Highlights

  • Due to an exponential growth in mobile traffic demand [1], dense heterogeneous networks (HetNets) of multi-tier base stations (BSs) are being deployed as a means of enhancing capacity

  • Relying on multi-access edge computing (MEC), a flexible functional split between small cell base stations (SBSs) and a centralized baseband (BB) unit pool [5] has been proposed, where part of the BB processing is executed at the SBSs while the remainder is offloaded to a central BB unit (BBU) pool (a rough sketch of such a split follows this list)

  • In [9], we have proposed an offline solution that provides performance bounds for the dynamic selection of functional split options in virtualized small cells (vSCs) powered by energy harvesting (EH)
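
As a rough illustration of the flexible functional split idea, the sketch below maps split options to the BB functions kept at the small cell versus offloaded to the BBU pool, and picks an option from the vSC's battery level. The option names, function groupings, and thresholds are assumptions made for illustration, not the split points or policy used in the paper.

```python
# Illustrative functional split options: which BB functions stay at the small
# cell (SBS) and which are offloaded to the central BBU pool. The option names,
# function groupings, and energy thresholds below are assumed, not the paper's.
SPLIT_OPTIONS = {
    "local":     {"at_sbs": ["PHY", "MAC", "RLC", "PDCP"], "at_bbu_pool": []},
    "split_phy": {"at_sbs": ["low_PHY"], "at_bbu_pool": ["high_PHY", "MAC", "RLC", "PDCP"]},
    "central":   {"at_sbs": [], "at_bbu_pool": ["PHY", "MAC", "RLC", "PDCP"]},
}

def select_split(battery_joules: float, high: float = 50.0, low: float = 10.0) -> str:
    """Pick a split option from the vSC's stored energy (simple threshold rule)."""
    if battery_joules >= high:
        return "local"      # enough harvested energy: run the full BB stack at the SBS
    if battery_joules >= low:
        return "split_phy"  # limited energy: keep only the low PHY at the SBS
    return "central"        # battery nearly depleted: offload all BB processing

# Example: a vSC with 30 J stored keeps only the low PHY locally.
# select_split(30.0) -> "split_phy"
```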

Summary

INTRODUCTION

Due to an exponential growth in mobile traffic demand [1], dense heterogeneous networks (HetNets) of multi-tier base stations (BSs) are being deployed as a means of enhancing capacity. In [9], we have proposed an offline solution that provides performance bounds for the dynamic selection of functional split options in vSCs powered by EH. In this work, we propose a distributed DRL (DDRL) algorithm for the dynamic control of functional split options in vSCs with EH capabilities, where each vSC is modeled as a distinct DRL-based agent that takes decisions in coordination with the other vSC agents. As opposed to tabular MRL, DDRL allows the learning agents to coordinate their policies through local state information exchange without resorting to practically infeasible state-action tables. We formulate a network-wide sequential decision-making problem in order to optimally leverage the flexible functional split options at the vSCs, with the goal of minimizing both the grid energy consumption and the amount of dropped traffic.
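A minimal sketch of one such per-vSC agent is given below, assuming a small Q-network whose observation concatenates the local battery level, the local traffic load, and the battery states shared by the other vSCs, and whose reward penalizes grid energy draw and dropped traffic. The network sizes, state layout, action set, and reward weight are illustrative assumptions, not the paper's exact design.

```python
# Minimal per-vSC DQN-style agent sketch (assumed structure; the state layout,
# two-hidden-unit sizes, and 0/1 split actions are illustrative only).
import random
import torch
import torch.nn as nn

class VscAgent:
    """One DRL agent per virtualized small cell (vSC)."""

    def __init__(self, state_dim, n_actions, lr=1e-3, gamma=0.95, eps=0.1):
        # Small Q-network mapping the observation to one Q-value per split option.
        self.q_net = nn.Sequential(
            nn.Linear(state_dim, 64), nn.ReLU(),
            nn.Linear(64, n_actions),
        )
        self.optim = torch.optim.Adam(self.q_net.parameters(), lr=lr)
        self.gamma, self.eps, self.n_actions = gamma, eps, n_actions

    def observe(self, battery_level, traffic_load, neighbor_batteries):
        # Local observation plus the battery states exchanged by the other vSCs;
        # its length must equal state_dim.
        return torch.tensor([battery_level, traffic_load, *neighbor_batteries],
                            dtype=torch.float32)

    def act(self, state):
        # Epsilon-greedy selection of a functional split option
        # (e.g. 0 = process BB locally, 1 = offload BB functions to the edge server).
        if random.random() < self.eps:
            return random.randrange(self.n_actions)
        with torch.no_grad():
            return int(self.q_net(state).argmax())

    def learn(self, state, action, reward, next_state):
        # One-step temporal-difference update of the Q-network.
        q = self.q_net(state)[action]
        with torch.no_grad():
            target = reward + self.gamma * self.q_net(next_state).max()
        loss = (q - target) ** 2
        self.optim.zero_grad()
        loss.backward()
        self.optim.step()


def reward(grid_energy, dropped_traffic, w=0.5):
    # Negative weighted sum of grid energy draw and dropped traffic, mirroring
    # the joint minimization objective (the weight w is an assumption).
    return -(w * grid_energy + (1.0 - w) * dropped_traffic)
```

In a full DDRL training loop, each vSC would periodically broadcast its battery state, build its observation with observe(), pick a split option with act(), and update its Q-network from the grid energy drawn and traffic dropped in the resulting slot.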

RELATED WORK
REFERENCE ARCHITECTURE
Problem statement
Power model
EH and Demand Profiles
Background
DDRL-based control
States
Actions
Reward
Simulation Scenario
Training Analysis
Policy Characteristics
Network Performance
Policy Validation
Energy Savings and Cost Analysis
CONCLUSIONS