Abstract

Channel state information (CSI) feedback for the downlink of a massive multiple-input multiple-output (MIMO) system is an essential and critical task for improving channel capacity and overall performance, especially in frequency division duplex (FDD) systems, where channel reciprocity cannot be exploited. However, the high channel feedback overhead severely degrades spectral efficiency. This work proposes a deep learning-based channel estimation (DLCE) model that improves channel reconstruction accuracy while reducing feedback overhead. The proposed deep learning (DL) mechanism consists of an encoder network, which compresses the CSI matrices, and a decoder network, which decompresses them. An inverse discrete Fourier transform (IDFT) is applied to convert the CSI matrices from the frequency domain to the delay domain. Simulation results are evaluated for the uplink and downlink of a massive MIMO system under the Cooperation in Science and Technology (COST) 2100 channel model, where a significant improvement in correlation and normalized mean square error (NMSE) is observed. The proposed DLCE model outperforms a range of existing channel estimation techniques in terms of NMSE and correlation.
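To make the pipeline concrete, the sketch below illustrates the kind of processing the abstract describes: frequency-domain CSI is mapped to the delay domain with an IDFT, compressed into a short codeword by the encoder at the user equipment, reconstructed by the decoder at the base station, and scored with NMSE. This is a minimal PyTorch sketch, not the paper's DLCE network; the layer sizes, 32-tap delay truncation, and 1/4 compression ratio are illustrative assumptions, since the abstract does not specify the architecture.

```python
import torch
import torch.nn as nn


def to_delay_domain(h_freq: torch.Tensor, n_delay: int = 32) -> torch.Tensor:
    """IDFT along the subcarrier axis maps frequency-domain CSI to the
    delay domain; keeping only the first `n_delay` taps (an assumed
    truncation, where most multipath energy concentrates) sparsifies
    the matrix before compression."""
    h_delay = torch.fft.ifft(h_freq, dim=-2)   # subcarriers -> delay taps
    return h_delay[..., :n_delay, :]


class DLCEAutoencoder(nn.Module):
    """Illustrative encoder-decoder for CSI feedback: the encoder
    compresses the delay-domain CSI matrix into a short codeword and
    the decoder reconstructs it. Layer sizes are assumptions."""

    def __init__(self, n_delay: int = 32, n_tx: int = 32, code_len: int = 512):
        super().__init__()
        in_dim = 2 * n_delay * n_tx            # real + imaginary parts, flattened
        self.encoder = nn.Sequential(nn.Flatten(), nn.Linear(in_dim, code_len))
        self.decoder = nn.Sequential(
            nn.Linear(code_len, in_dim),
            nn.Unflatten(1, (2, n_delay, n_tx)),
        )

    def forward(self, h):                      # h: (batch, 2, n_delay, n_tx)
        return self.decoder(self.encoder(h))


def nmse(h_true: torch.Tensor, h_est: torch.Tensor) -> torch.Tensor:
    """Normalized mean square error, the reconstruction metric used in
    the abstract: E[ ||H - H_hat||^2 / ||H||^2 ]."""
    err = (h_true - h_est).flatten(1).pow(2).sum(dim=1)
    ref = h_true.flatten(1).pow(2).sum(dim=1)
    return (err / ref).mean()


# Minimal usage on random tensors standing in for COST 2100 channel samples.
h_freq = torch.randn(8, 256, 32, dtype=torch.cfloat)     # (batch, subcarriers, tx antennas)
h_delay = to_delay_domain(h_freq)                        # (8, 32, 32), complex
h_in = torch.stack([h_delay.real, h_delay.imag], dim=1)  # (8, 2, 32, 32), real-valued
model = DLCEAutoencoder()
print(nmse(h_in, model(h_in)))
```

Training such a model end to end on COST 2100 channel realizations, minimizing the reconstruction error, is the usual recipe for this family of CSI-feedback autoencoders; the random tensors above merely stand in for real channel samples.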
