Abstract
Channel state information (CSI) feedback estimation for the downlink of a massive multiple-input multiple-output (MIMO) system is a critical task for improving channel capacity and performance, especially in frequency division duplex (FDD) systems. However, the high channel feedback overhead severely degrades spectral efficiency. This work proposes a deep learning-based channel estimation (DLCE) model to improve channel reconstruction efficiency and the accuracy of channel overhead reduction. The proposed deep learning (DL) mechanism consists of an encoder network, which compresses the CSI matrices, and a decoder network, which decompresses the recovered CSI matrices. An inverse discrete Fourier transform (IDFT) is used to convert the CSI matrices from the frequency domain into the delay domain. Simulation results are evaluated for the uplink and downlink of a massive MIMO system using the cooperation in science and technology (COST) 2100 channel model, where a significant improvement in correlation and normalized mean square error (NMSE) is observed. The proposed DLCE model outperforms various channel estimation techniques in terms of NMSE and correlation efficiency.
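The components named in the abstract — the IDFT frequency-to-delay conversion, an encoder/decoder compression pipeline, and the NMSE metric — can be sketched as below. This is a minimal illustration only, assuming a CsiNet-style dense autoencoder; the dimensions (Nc, Nt, Nd), the codeword length, and the layer choices are assumptions for illustration, not the paper's actual architecture.

```python
# Minimal sketch of the pipeline described in the abstract (assumptions:
# Nc subcarriers x Nt antennas CSI matrix, IDFT along the subcarrier axis,
# truncation to the first Nd delay taps, dense encoder/decoder).
import numpy as np
import torch
import torch.nn as nn

Nc, Nt, Nd = 1024, 32, 32   # subcarriers, Tx antennas, retained delay taps (assumed)

def freq_to_delay(H_freq: np.ndarray) -> np.ndarray:
    """IDFT over the subcarrier axis; keep only the first Nd taps,
    where the channel energy concentrates (delay-domain sparsity)."""
    H_delay = np.fft.ifft(H_freq, axis=0)
    return H_delay[:Nd, :]

class DLCE(nn.Module):
    """Encoder compresses the (real, imag)-stacked delay-domain CSI into a
    low-dimensional codeword; decoder reconstructs the CSI from it."""
    def __init__(self, dim: int = 2 * Nd * Nt, code: int = 128):  # code length assumed
        super().__init__()
        self.encoder = nn.Linear(dim, code)
        self.decoder = nn.Sequential(nn.Linear(code, dim), nn.Tanh())

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

def nmse_db(H: np.ndarray, H_hat: np.ndarray) -> float:
    """NMSE = E[||H - H_hat||^2 / ||H||^2], reported in dB."""
    err = np.sum(np.abs(H - H_hat) ** 2, axis=(-2, -1))
    pwr = np.sum(np.abs(H) ** 2, axis=(-2, -1))
    return float(10 * np.log10(np.mean(err / pwr)))
```

In this sketch the delay-domain truncation does most of the dimensionality reduction before the learned encoder is applied, which mirrors the motivation for the IDFT step: the delay-domain CSI is sparse, so a small codeword can capture it.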