Abstract

In this work, an iterative time-domain Least Squares (LS) channel estimation method using superimposed training (ST) is proposed for a Multiple Input Multiple Output Orthogonal Frequency Division Multiplexing (MIMO-OFDM) system operating over time-varying, frequency-selective fading channels. The performance of the channel estimator is analyzed in terms of the Mean Square Estimation Error (MSEE), and its impact on the uncoded Bit Error Rate (BER) of the MIMO-OFDM system is studied. A new selection criterion for the training sequences that jointly optimizes the MSEE and the BER of the OFDM system is proposed. Chirp-based sequences are proposed and shown to satisfy this criterion; compared with other sequences proposed in the literature, they yield superior performance. The sequences, one for each transmitting antenna, offer fairness by spreading equal interference over all the data carriers, unlike earlier proposals. The validity of the mathematical analysis is demonstrated through comparison with simulation studies, and experimental studies are carried out to validate the improved performance of the proposed scheme. The scheme is applied to the IEEE 802.16e OFDM standard, and the required sequence design for this case is presented.
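The equal-interference property claimed above follows from the flat spectrum of chirp sequences. A minimal sketch, assuming an illustrative even length N and chirp root u (not the paper's actual sequence parameters), showing that a discrete chirp has constant magnitude on every subcarrier:

```python
import numpy as np

# Discrete chirp (Zadoff-Chu-type) sequence; length N and root u are
# illustrative assumptions, not the paper's design parameters.
N, u = 64, 1
n = np.arange(N)
c = np.exp(-1j * np.pi * u * n**2 / N)   # constant amplitude in time

# Its DFT also has constant magnitude, so when superimposed on the data
# it injects the same interference power into every subcarrier.
C = np.fft.fft(c)
print(np.allclose(np.abs(C), np.sqrt(N)))  # → True (flat spectrum)
```

This constant-amplitude, zero-autocorrelation (CAZAC) behaviour is what lets one sequence per transmit antenna treat all data carriers equally.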

Highlights

  • Channel estimation in MIMO-OFDM is a challenging task

  • In this work, experimental studies are carried out to evaluate the performance of the proposed scheme for a superimposed training (ST)-based MIMO-OFDM system, and a comparison with existing schemes is presented

  • The Mean Square Estimation Error (MSEE) and the Bit Error Rate (BER) simulation results presented are obtained by averaging over an ensemble of 1000 Monte Carlo iterations


Summary

Introduction

Channel estimation in MIMO-OFDM is a challenging task. Conventionally, it is performed using comb-type pilots, block-type pilots, or orthogonal pilots, as discussed in [1,2,3]. In the superimposed training case, unlike the time-multiplexed training case, the channel estimation and symbol detection processes are coupled to each other. As a result, the ST sequences may cause strong interference in a few subcarriers, leading to serious BER performance degradation. The main contribution of this work is a selection criterion, together with a set of optimal training sequences satisfying it, for ST-based MIMO-OFDM systems. Towards this end, the performance of the channel estimate of the proposed scheme is mathematically analyzed in terms of the MSEE, and its impact on the BER of the MIMO-OFDM system is studied. Superscripts (·)T, (·)H, and (·)∗ denote matrix transpose, Hermitian transpose, and complex conjugation, respectively, and ‖·‖ denotes the ℓ2 norm.
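The coupling between estimation and detection can be illustrated with a single-antenna sketch of superimposed-training LS estimation. All parameters below (block length, channel order, number of averaged symbols, chirp training) are assumptions for illustration, not the paper's MIMO setup: the training is added on top of zero-mean data, the receiver averages many received blocks so the data term averages out, and the channel taps are recovered by a time-domain LS fit.

```python
import numpy as np

rng = np.random.default_rng(0)
N, L, K = 64, 4, 400          # block length, channel taps, averaged blocks (assumed)

# Chirp training sequence to be superimposed on the data
n = np.arange(N)
c = np.exp(-1j * np.pi * n**2 / N)

# Random frequency-selective channel, normalized to unit energy
h = rng.normal(size=L) + 1j * rng.normal(size=L)
h /= np.linalg.norm(h)
H = np.fft.fft(h, N)

# Transmit K blocks of zero-mean QPSK data with c added on top;
# the cyclic prefix is modeled as circular convolution with h.
ybar = np.zeros(N, dtype=complex)
for _ in range(K):
    d = (rng.choice([-1.0, 1.0], N) + 1j * rng.choice([-1.0, 1.0], N)) / np.sqrt(2)
    x = d + c
    ybar += np.fft.ifft(np.fft.fft(x) * H)
ybar /= K                     # the data interference averages toward zero

# Time-domain LS: ybar ≈ C @ h, where C holds cyclic shifts of c
C = np.stack([np.roll(c, l) for l in range(L)], axis=1)
h_hat = np.linalg.lstsq(C, ybar, rcond=None)[0]
print(np.max(np.abs(h_hat - h)))   # small residual left by the data term
```

Because the data is never removed before estimation, the residual error here is set by the data-to-training power ratio and the number of averaged blocks, which is precisely why the choice of training sequence matters for both MSEE and BER.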

Superimposed Training in MIMO-OFDM Systems
Performance Analysis
Chirp-Based Optimal Training Sequences
Experimental Results
Optimal Training Sequences
Conclusions