Abstract

5G is poised to support emerging service types that enable futuristic applications: enhanced Mobile BroadBand (eMBB), Ultra-Reliable Low-Latency Communication (URLLC), and massive Machine-Type Communication (mMTC). 5G New Radio (NR) is envisioned to efficiently support URLLC for services and applications that demand high reliability, high availability, and low latency, such as factory automation and autonomous vehicles, while also delivering massive increases in traffic volume and data rates. Next-generation wireless networks are expected to be extremely complex due to their massive heterogeneity: in the network architectures they incorporate, in the types and numbers of smart IoT devices they serve, and in the emerging applications they support. In such large-scale networks, radio resource allocation and management (RRAM) becomes one of the major challenges in system design and deployment. In this context, emerging Deep Reinforcement Learning (DRL) techniques are expected to be one of the main enabling technologies for RRAM in future wireless networks. The paper provides a detailed analysis of the impact of various parameters on system performance, including the number of users and the signal-to-interference-plus-noise ratio (SINR). The proposed approach has the potential to significantly improve the performance of 5G networks and enable new applications and services that require high data rates, low latency, and reliable communication. We propose an algorithm for data bearers in the millimeter-wave (mmWave) frequency band.
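To make the DRL-for-RRAM idea concrete, the following is a minimal, hypothetical sketch (not the paper's algorithm): a tabular Q-learning agent choosing among discrete transmit power levels to maximize the Shannon rate under fixed noise-plus-interference. All numbers (power levels, channel gain, interference) are illustrative assumptions; real RRAM problems have far larger state and action spaces, which is where deep RL replaces the table.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy setup: one link picks a transmit power level;
# reward is the Shannon rate under fixed noise-plus-interference.
POWER_LEVELS = np.array([0.1, 0.5, 1.0])  # watts (assumed values)
GAIN = 0.8                                # channel gain (assumed)
NOISE_PLUS_INTERF = 0.2                   # noise + interference power (assumed)

def rate(p):
    # SINR-based spectral efficiency in bit/s/Hz
    sinr = GAIN * p / NOISE_PLUS_INTERF
    return np.log2(1.0 + sinr)

# Tabular Q-learning on a single-state problem (i.e., a bandit):
Q = np.zeros(len(POWER_LEVELS))
alpha, eps = 0.1, 0.1  # learning rate, exploration probability
for t in range(2000):
    # epsilon-greedy action selection
    if rng.random() < eps:
        a = int(rng.integers(len(POWER_LEVELS)))
    else:
        a = int(np.argmax(Q))
    r = rate(POWER_LEVELS[a])
    # single-state update: no bootstrapped next-state term
    Q[a] += alpha * (r - Q[a])

best = int(np.argmax(Q))
print("learned power level:", POWER_LEVELS[best])
```

With this reward, the agent should converge to the highest power level, since there is no power cost or inter-user interference penalty in the toy model; adding those terms is what makes realistic RRAM non-trivial.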
