Abstract

In fifth-generation (5G) communications, millimeter wave (mmWave) is one of the key technologies to increase the data rate. To overcome this technology's poor propagation characteristics, it is necessary to employ a large number of antennas and form narrow beams. It then becomes crucial, especially during initial access, to attain fine beam alignment between a next generation NodeB (gNB) and a user equipment (UE). The current 5G New Radio (NR) standard, however, adopts exhaustive search-based beam sweeping, which incurs a time overhead of half a frame for initial beam establishment. In this paper, we propose a deep learning-based beam selection that is compatible with the 5G NR standard. To select a mmWave beam, we exploit sub-6 GHz channel information. We introduce a deep neural network (DNN) structure and explain how we estimate the power delay profile (PDP) of a sub-6 GHz channel, which is used as the input of the DNN. We then validate its performance with real environment-based 3D ray-tracing simulations and over-the-air experiments with a mmWave prototype. Evaluation results confirm that, with support from the sub-6 GHz connection, the proposed beam selection reduces the beam sweeping overhead by up to 79.3%.
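As a rough illustration of the idea summarized above, the following Python sketch treats beam selection as classification: a small fully connected network maps a sub-6 GHz PDP vector to scores over candidate mmWave beams. The layer sizes, the PDP length (NUM_PDP_TAPS), and the codebook size (NUM_BEAMS) are assumptions for illustration, not the authors' actual architecture.

```python
# Minimal sketch (not the paper's exact DNN): a fully connected classifier
# that maps a sub-6 GHz power delay profile (PDP) vector to a probability
# distribution over candidate mmWave beams. All sizes are assumed values.
import torch
import torch.nn as nn

NUM_PDP_TAPS = 64      # assumed length of the PDP input vector
NUM_BEAMS = 64         # assumed size of the mmWave beam codebook

class BeamSelector(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NUM_PDP_TAPS, 256), nn.ReLU(),
            nn.Linear(256, 128), nn.ReLU(),
            nn.Linear(128, NUM_BEAMS),   # logits over candidate beams
        )

    def forward(self, pdp):
        return self.net(pdp)

model = BeamSelector()
pdp = torch.rand(1, NUM_PDP_TAPS)          # placeholder PDP measurement
beam_scores = model(pdp).softmax(dim=-1)   # per-beam probabilities
best_beam = beam_scores.argmax(dim=-1)     # index of the predicted beam
```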

Highlights

  • After Long Term Evolution (LTE), fifth-generation (5G) cellular network technology has emerged to fulfill requirements to support various services including enhanced mobile broadband (eMBB), ultra-reliable and low-latency communication (URLLC), and massive machine-type communications (mMTC) [1]

  • We propose a deep learning-based beam selection that, unlike prior works, exploits channel state information (CSI) of a sub-6 GHz channel to choose a millimeter wave (mmWave) beam

  • Since the beams we consider are formed in the angular domain, it is natural to employ channel characteristics such as the angle of departure (AoD) or the power delay profile (PDP), as sketched after this list
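One common way to obtain a PDP from sub-6 GHz CSI is to transform the estimated channel frequency response into the delay domain and take the per-tap powers. The sketch below follows that generic recipe and is not necessarily the estimation procedure used in the paper; the subcarrier count, tap delays, and gains are illustrative assumptions.

```python
# Hedged sketch: estimate a PDP as the per-tap power of the IDFT of the
# per-subcarrier channel estimate (e.g., from sub-6 GHz reference signals).
import numpy as np

def estimate_pdp(channel_freq_response: np.ndarray) -> np.ndarray:
    """channel_freq_response: complex per-subcarrier channel estimate."""
    impulse_response = np.fft.ifft(channel_freq_response)   # delay domain
    pdp = np.abs(impulse_response) ** 2                     # tap powers
    return pdp / pdp.sum()                                  # normalize

# Example with a synthetic two-path channel over 64 subcarriers (assumed).
n_sc = 64
delays = np.array([3, 10])                  # tap indices (samples)
gains = np.array([1.0, 0.5])
h_freq = sum(g * np.exp(-2j * np.pi * d * np.arange(n_sc) / n_sc)
             for g, d in zip(gains, delays))
print(estimate_pdp(h_freq).argmax())        # strongest tap near delay 3
```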


Summary

INTRODUCTION

After Long Term Evolution (LTE), fifth-generation (5G) cellular network technology has emerged to fulfill requirements to support various services including enhanced mobile broadband (eMBB), ultra-reliable and low-latency communication (URLLC), and massive machine-type communications (mMTC) [1]. In mmWave networks, base stations, called next generation NodeBs (gNBs) in NR, are densely deployed, and directional antennas are employed. Prior approaches to beam selection take one of two forms: the former exploits the received signal directly, while the latter estimates channel characteristics. Both methods share the limitation that a mmWave link has to be established before the input data can be gathered, which does not fit initial beam establishment. We propose a deep learning-based beam selection that, unlike prior works, exploits channel state information (CSI) of a sub-6 GHz channel to choose a mmWave beam. Results of ray tracing-based simulations and prototype-based experiments are presented in Section IV, and Section V presents our conclusions.
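To make the overhead argument concrete, the sketch below shows one way the DNN output could plug into standard beam sweeping: the gNB sweeps only the k beams ranked highest by the network instead of the full codebook. The codebook size, k, and the random scores are assumptions for illustration; the 79.3% reduction reported in the paper comes from its own evaluation, not from this arithmetic.

```python
# Illustrative sketch (not from the paper): prune the beam sweep to the
# k candidates ranked highest by the DNN instead of the full codebook.
import numpy as np

def beams_to_sweep(beam_scores: np.ndarray, k: int) -> np.ndarray:
    """Return indices of the k most promising beams, best first."""
    return np.argsort(beam_scores)[::-1][:k]

num_beams, k = 64, 8                         # assumed values
scores = np.random.rand(num_beams)           # stand-in for DNN output
candidates = beams_to_sweep(scores, k)
overhead_reduction = 1 - k / num_beams       # fraction of sweeps avoided
print(candidates, f"{overhead_reduction:.1%} fewer beams swept")
```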

SYSTEM MODEL AND PROBLEM FORMULATION
DEEP LEARNING-BASED BEAM SELECTION
PERFORMANCE EVALUATION
CONCLUDING REMARKS