Channel estimation remains a challenge for space-time block coding (STBC) multiple-input multiple-output orthogonal frequency-division multiplexing (MIMO-OFDM) systems in time-varying environments. To estimate the channel state information (CSI) precisely without significantly increasing complexity, this paper exploits the sparsity and the inherent temporal correlation of the time-varying wireless channel and proposes a novel channel estimation method for STBC MIMO-OFDM systems. The proposed method consists of two schemes: adaptive multi-frame averaging (AMA) and an improved mean square error (MSE) optimal threshold (IMOT). First, the temporal correlation of the time-varying channel is modeled by a linear Gauss-Markov (LGM) model, and the AMA scheme refines the initially estimated channel impulse response (CIR) through noise reduction; based on the LGM model, the optimal number of averaged frames is determined adaptively by minimizing the MSE of the denoised CIR. Then, the sparsity of the wireless channel is exploited to model the CIR as a sparse vector, and the IMOT scheme further suppresses noise by discarding most of the noise-only CIR taps. Specifically, the IMOT scheme recovers the CIR support using an optimal tap-by-tap threshold derived by minimizing the MSE of each CIR tap. Moreover, the prior confidence that each tap is active is computed from multi-frame statistics to further improve the performance of the IMOT scheme. Simulation results verify that the proposed AMA-IMOT channel estimation method outperforms the benchmark methods.
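As a rough illustration of the two-stage pipeline described above, the sketch below simulates a sparse LGM channel and applies (i) multi-frame averaging whose length is chosen by numerically minimizing an analytic MSE expression, and (ii) a per-tap threshold weighted by a multi-frame activity prior. Every concrete choice here (the autocorrelation model r(m) = sigma_h^2 * a^|m|, the MSE(M) expression, the Bernoulli-Gaussian likelihood-ratio test standing in for the paper's MSE-optimal tap-by-tap threshold, and the coarse 3-sigma gate used to estimate the priors) is an assumption made for illustration, not the authors' exact derivation.

```python
# Minimal sketch of a two-stage AMA-IMOT-style denoiser (illustrative only).
# Assumed, not from the paper: complex Gaussian taps, known noise variance,
# a likelihood-ratio per-tap test in place of the paper's MSE-optimal
# threshold, and a numerically searched averaging length.
import numpy as np

rng = np.random.default_rng(0)

L, K = 64, 200            # CIR length, number of OFDM frames
a = 0.98                  # LGM temporal-correlation coefficient
sigma_h2, sigma_n2 = 1.0, 0.1
support = rng.choice(L, size=6, replace=False)   # sparse active taps

# --- Generate an LGM channel and noisy per-frame LS-style CIR estimates ---
h = np.zeros((K, L), dtype=complex)
g = (rng.standard_normal(6) + 1j * rng.standard_normal(6)) * np.sqrt(sigma_h2 / 2)
for k in range(K):
    w = (rng.standard_normal(6) + 1j * rng.standard_normal(6)) * np.sqrt(sigma_h2 / 2)
    g = a * g + np.sqrt(1 - a**2) * w            # Gauss-Markov evolution
    h[k, support] = g
y = h + (rng.standard_normal((K, L)) + 1j * rng.standard_normal((K, L))) * np.sqrt(sigma_n2 / 2)

# --- AMA: pick the averaging length M that minimizes the analytic MSE ---
# For a stationary LGM tap with autocorrelation r(m) = sigma_h2 * a^|m|,
# averaging the last M frames to estimate the current tap has
#   MSE(M) = (1/M^2) sum_{i,j} r(i-j) - (2/M) sum_i r(i) + r(0) + sigma_n2/M.
def ama_mse(M):
    r = lambda m: sigma_h2 * a ** abs(m)
    i = np.arange(M)
    cross = sum(r(p - q) for p in i for q in i) / M**2
    return cross - 2 * np.mean([r(p) for p in i]) + r(0) + sigma_n2 / M

M_opt = min(range(1, 31), key=ama_mse)

def ama(k):
    """Average the last M_opt frames up to frame k (noise variance drops ~1/M)."""
    return y[max(0, k - M_opt + 1):k + 1].mean(axis=0)

# --- IMOT-style tap-by-tap threshold with a multi-frame activity prior ---
# Prior p_l: fraction of frames where tap l exceeded a coarse 3-sigma gate.
p = np.clip((np.abs(y) ** 2 > 9 * sigma_n2).mean(axis=0), 1e-3, 1 - 1e-3)
s0 = sigma_n2 / M_opt                 # variance of an inactive averaged tap
s1 = s0 + sigma_h2                    # variance of an active averaged tap
# Keep tap l iff p_l * CN(0, s1) beats (1 - p_l) * CN(0, s0) at |h_bar_l|^2.
thr = (s0 * s1 / (s1 - s0)) * (np.log(s1 / s0) + np.log((1 - p) / p))

k = K - 1
h_bar = ama(k)
h_hat = np.where(np.abs(h_bar) ** 2 > thr, h_bar, 0)   # discard noise-only taps

mse_raw = np.mean(np.abs(y[k] - h[k]) ** 2)
mse_den = np.mean(np.abs(h_hat - h[k]) ** 2)
print(f"M_opt={M_opt}, raw MSE={mse_raw:.4f}, denoised MSE={mse_den:.4f}")
```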