Abstract

Channel estimation plays a critical role in the performance of wireless networks, and deep learning has demonstrated significant gains in enhancing communication reliability and reducing computational complexity in 5G-and-beyond networks. Although least squares (LS) estimation is popular for obtaining channel estimates, owing to its low cost and its independence from prior statistical information about the channel, it suffers from relatively high estimation error. This paper proposes a new channel estimation architecture that uses deep learning to improve the channel estimates obtained by the LS approach. Our goal is achieved by simulating a multiple-input multiple-output (MIMO) system with a multi-path channel profile for 5G-and-beyond networks, under mobility levels expressed through Doppler effects. The system model is constructed for an arbitrary number of transceiver antennas, while the machine learning module is generalized in the sense that an arbitrary neural network architecture can be exploited. Numerical results demonstrate the superiority of the proposed deep learning-based channel estimation framework over the traditional channel estimation methods popularly used in previous works. In addition, bidirectional long short-term memory offers the best channel estimation quality and the lowest bit error rate among the considered artificial neural network architectures.

Highlights

  • We evaluate the performance of the proposed deep learning-based channel estimation over the 5G channel profile and compare it with the traditional methods, i.e., least squares (LS) and linear minimum mean square error (LMMSE)

  • The parameters used for the fully-connected deep neural network (FDNN) model, convolutional neural network (CNN) model, and bidirectional long short-term memory (BiLSTM) model are given in Tables 2–4, respectively

Introduction

The exponential increases in wireless throughput for many different types of users with high quality-of-service demands have been predicted to continue in upcoming years [1]. Fifth-generation (5G) and beyond wireless communication has been developed by integrating several disruptive technologies, such as Massive MIMO, mmWave communications, and reconfigurable intelligent surfaces, to handle the fast growth in wireless data traffic and to provide reliable communications [2,3,4]. Orthogonal frequency-division multiplexing (OFDM) is still deployed in 5G systems to combat frequency-selective fading, offering good communication quality in multi-path propagation environments [5]. To decode the desired signal effectively, the channel state information and its effects should be estimated and compensated at the receiver. For this purpose, pilot signals known to both the transmitter and receiver are exploited to perform the channel estimation. Minimum mean square error (MMSE) estimation usually has high computational complexity since channel statistics, i.e., the mean values and the covariance matrices of the propagation channels, are required. This statistical information is either extremely difficult to obtain or varies quickly within a short coherence time, making MMSE estimation challenging to implement [12,13].
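To make the trade-off above concrete, the following is a minimal NumPy sketch of per-subcarrier LS estimation versus an LMMSE refinement on a toy single-antenna OFDM pilot block. All setup values (subcarrier count, SNR, i.i.d. Rayleigh channel) are illustrative assumptions, not parameters from the paper; note that the LMMSE step needs the channel covariance and noise variance, which is exactly the prior information the LS estimator avoids.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative, not from the paper): N pilot subcarriers,
# unit-modulus QPSK pilots x known to both sides, frequency-domain
# channel h, received pilots y = h * x + noise.
N = 64
x = np.exp(1j * (np.pi / 2) * rng.integers(0, 4, N))
h = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)
snr_db = 10.0
noise_var = 10 ** (-snr_db / 10)
n = np.sqrt(noise_var / 2) * (rng.normal(size=N) + 1j * rng.normal(size=N))
y = h * x + n

# LS estimate: per-subcarrier division; cheap, needs no channel statistics.
h_ls = y / x

# LMMSE refinement of the LS estimate: requires the channel covariance R_h
# and the noise variance. For an i.i.d. unit-power Rayleigh channel, R_h
# is the identity; in practice R_h must be known or estimated, which is
# the costly requirement discussed in the introduction.
R_h = np.eye(N)
W = R_h @ np.linalg.inv(R_h + noise_var * np.eye(N))
h_lmmse = W @ h_ls

mse_ls = np.mean(np.abs(h_ls - h) ** 2)
mse_lmmse = np.mean(np.abs(h_lmmse - h) ** 2)
print(mse_ls, mse_lmmse)
```

With unit-modulus pilots, the LS error equals the noise power (about 0.1 at 10 dB SNR), while the LMMSE shrinkage reduces the mean square error, mirroring the LS-versus-MMSE gap the paper sets out to close with a learned module instead of explicit channel statistics.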

