Abstract

Nonorthogonal multiple access (NOMA) is a promising multiple access scheme for 5G wireless networks. However, NOMA faces several challenges that have yet to be solved optimally, and deep learning algorithms have been proposed as a potential solution. This review provides an overview of the use of deep learning algorithms to optimize NOMA performance in 5G networks. It investigates how deep learning methods are applied in NOMA systems for resource allocation, channel estimation and detection, successive interference cancellation, and user clustering. These methods can learn optimal user clustering, resource allocation, and interference alignment strategies, ultimately boosting network performance. In addition, deep learning algorithms can learn the complex relationships between transmitted symbols and the received signal, enabling accurate detection of the superimposed signals. Based on existing research showing where deep learning algorithms outperform conventional approaches, opportunities and challenges in NOMA can be identified. The main contribution of this review is to provide insights into the potential of deep learning algorithms to substantially improve NOMA performance in 5G networks. The article is also a valuable resource for researchers and practitioners interested in applying deep learning algorithms to NOMA in 5G networks.
