Abstract

This paper evaluates the performance of a deep learning approach deployed in fifth-generation (5G) millimeter wave (mmWave) multicellular networks. To this end, the optimum beamforming configuration is defined by two neural networks (NNs) that are trained via mean square error (MSE) minimization. The first network takes as input the requested spectral efficiency (SE) per active sector, while the second takes the corresponding energy efficiency (EE). Hence, channel and power variations can be taken into consideration during adaptive beamforming. The performance of the proposed approach is evaluated via extensive Monte Carlo simulations using a purpose-built system-level simulator. According to the presented results, machine learning (ML)-adaptive beamforming can significantly improve EE compared to the standard non-ML framework. Although this improvement comes at the cost of increased blocking probability (BP) and additional radiating elements (REs) for high data rate services, the corresponding increase ratios are significantly smaller than the EE improvement ratio. In particular, for 21.6 Mbps per active user and ML-adaptive beamforming, the EE can reach up to 5.3 Mbps/W, a significant improvement over the non-ML case (0.9 Mbps/W). In this context, BP does not exceed 2.6%, only slightly worse than the 1.7% of the standard non-ML case. Moreover, approximately 20% additional REs are required with respect to the non-ML framework.
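
To make the described setup more concrete, the sketch below pairs two small regression networks, one fed with the requested SE per active sector and one with the corresponding EE, and trains both by MSE minimization against a target beamforming configuration. The abstract does not specify layer sizes, the output encoding of the beamforming configuration, the number of sectors, or the training data, so all of those choices here are illustrative assumptions rather than the authors' actual design.

```python
# Minimal sketch, assuming: 3 active sectors, one beamforming parameter per
# sector, small MLPs as the two NNs, and synthetic placeholder training data.
import torch
import torch.nn as nn


def make_regressor(in_dim: int, out_dim: int, hidden: int = 64) -> nn.Module:
    """Small MLP regressor; the architecture is a placeholder assumption."""
    return nn.Sequential(
        nn.Linear(in_dim, hidden),
        nn.ReLU(),
        nn.Linear(hidden, hidden),
        nn.ReLU(),
        nn.Linear(hidden, out_dim),
    )


num_sectors = 3           # assumed number of active sectors
config_dim = num_sectors  # assumed: one beamforming parameter per sector

se_net = make_regressor(num_sectors, config_dim)  # input: requested SE per sector
ee_net = make_regressor(num_sectors, config_dim)  # input: corresponding EE per sector

criterion = nn.MSELoss()
optimizer = torch.optim.Adam(
    list(se_net.parameters()) + list(ee_net.parameters()), lr=1e-3
)

# Synthetic placeholder data standing in for the training set: requested SE
# and EE per sector, and the optimum beamforming configuration obtained
# offline (e.g., from the system-level simulator).
se_requested = torch.rand(256, num_sectors) * 10.0   # bps/Hz, assumed range
ee_requested = torch.rand(256, num_sectors) * 5.0    # Mbps/W, assumed range
optimum_config = torch.rand(256, config_dim)          # normalized target

for epoch in range(100):
    optimizer.zero_grad()
    # Both networks are regressed onto the same optimum configuration via MSE.
    loss = criterion(se_net(se_requested), optimum_config) + \
           criterion(ee_net(ee_requested), optimum_config)
    loss.backward()
    optimizer.step()
```

At inference time, the trained networks would map the currently requested SE and EE per sector to a beamforming configuration, allowing the configuration to adapt as channel and power conditions vary.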
