Abstract

In this work, a deep learning (DL)-based massive multiple-input multiple-output (mMIMO) orthogonal frequency division multiplexing (OFDM) system is investigated over the tapped delay line type C (TDL-C) channel model with Rayleigh fading at carrier frequencies ranging from 0.5 to 100 GHz. The proposed bidirectional long short-term memory (Bi-LSTM) channel state information (CSI) estimator uses online learning during the training phase and offline learning during practical deployment. The estimator is designed for situations in which prior knowledge of channel statistics is limited and targets strong performance even with a limited number of pilot symbols (PS). Three separate loss functions (mean square logarithmic error [MSLE], Huber, and Kullback-Leibler distance [KLD]) are assessed in three classification layers. The symbol error rate (SER) and outage probability performance of the proposed estimator are evaluated using several optimization techniques, namely stochastic gradient descent (SGD), momentum, and the adaptive gradient (AdaGrad) algorithm. The Bi-LSTM-based CSI estimator is trained with a fixed number of PS. Incorporating a cyclic prefix (CP) makes the system more resilient to channel impairments, resulting in a lower SER. Simulations show that the Bi-LSTM-based CSI estimator trained with the SGD optimizer and the Huber loss function achieves the lowest SER and very high estimation accuracy. Using deep neural networks (DNNs), the Bi-LSTM method for CSI estimation achieves higher channel capacity (in bps/Hz) at an SNR of 10 dB than long short-term memory (LSTM) and conventional CSI estimators such as minimum mean square error (MMSE) and least squares (LS). The simulation results validate the analytical results in the study.
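
The abstract names the key ingredients of the estimator (a Bi-LSTM network with a classification output, the Huber loss, and the SGD optimizer) without giving implementation details. The sketch below is only an illustration of how such an estimator could be wired up in Keras; the subcarrier count, layer widths, modulation order (QPSK), and input formatting (real/imaginary parts of the received pilot-bearing samples) are assumptions for the example, not values taken from the paper.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, losses, optimizers

# Assumed (not from the paper): 64 subcarriers, QPSK (4 symbol classes),
# received pilots fed as stacked real/imaginary parts per subcarrier.
NUM_SUBCARRIERS = 64
NUM_CLASSES = 4

model = tf.keras.Sequential([
    tf.keras.Input(shape=(NUM_SUBCARRIERS, 2)),       # (Re, Im) of received samples
    layers.Bidirectional(layers.LSTM(128, return_sequences=True)),
    layers.Bidirectional(layers.LSTM(64)),
    layers.Dense(NUM_CLASSES, activation="softmax"),  # classification layer over transmitted symbols
])

# Huber loss on one-hot symbol labels with plain SGD: the combination the
# abstract reports as giving the lowest SER.
model.compile(
    optimizer=optimizers.SGD(learning_rate=0.01),
    loss=losses.Huber(delta=1.0),
    metrics=["accuracy"],
)

# Synthetic placeholder data standing in for (received pilots, symbol labels).
x_train = np.random.randn(256, NUM_SUBCARRIERS, 2).astype("float32")
y_train = tf.keras.utils.to_categorical(
    np.random.randint(0, NUM_CLASSES, size=256), NUM_CLASSES)

model.fit(x_train, y_train, batch_size=32, epochs=2, verbose=0)
```

Swapping `losses.Huber` for `losses.MeanSquaredLogarithmicError` or `losses.KLDivergence`, and `optimizers.SGD` for `optimizers.SGD(momentum=0.9)` or `optimizers.Adagrad`, would mirror the loss-function and optimizer comparison described in the abstract.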
