Abstract

Orthogonal Frequency Division Multiplexing (OFDM) signals are significantly damaged by inter-carrier interference (ICI) in fast time-varying fading channels, which leads to severe degradation of bit error rate (BER) performance due to the loss of orthogonality among subcarriers. To solve this problem, this paper proposes an iteration-based maximum likelihood demodulation (MLD) method that achieves better BER performance with lower computational complexity even in fast time-varying fading channels. The features of the proposed method are to employ a time domain training sequence (TS) for the estimation of the channel impulse response (CIR) instead of pilot subcarriers in the frequency domain, and to employ a time domain equalization (TDE) method with maximum likelihood (ML) estimation instead of a conventional frequency domain equalization (FDE) method. This paper also proposes a low-complexity iterative method for solving the simultaneous equations in the MLD method instead of using an inverse matrix calculation. This paper presents various simulation results in fast time-varying fading channels to demonstrate the effectiveness of the proposed method in comparison with the conventional frequency domain equalization method.
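The abstract does not give the paper's exact iterative update, but the core idea of solving the MLD simultaneous equations iteratively rather than by matrix inversion can be illustrated with a generic stationary iteration such as Gauss-Seidel. The sketch below is an assumption-laden illustration, not the authors' algorithm: it solves a small system A x = b by repeated sweeps, avoiding the O(N^3) cost of forming the inverse, which is the complexity saving the abstract refers to.

```python
import numpy as np

def gauss_seidel(A, b, iterations=50):
    """Iteratively solve A x = b without forming A^{-1}.

    One sweep updates each unknown in turn using the latest values
    of the others; convergence is guaranteed when A is diagonally
    dominant, as in this illustrative example.
    """
    n = len(b)
    x = np.zeros(n, dtype=float)
    for _ in range(iterations):
        for i in range(n):
            # Sum of off-diagonal contributions using current estimates
            s = A[i] @ x - A[i, i] * x[i]
            x[i] = (b[i] - s) / A[i, i]
    return x

# Small diagonally dominant system (hypothetical values for illustration)
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 5.0, 2.0],
              [0.0, 2.0, 6.0]])
b = np.array([1.0, 2.0, 3.0])
x = gauss_seidel(A, b)
```

Each sweep costs O(N^2), so a small fixed number of iterations is cheaper than the O(N^3) inversion when the system converges quickly, which is the trade-off the proposed method exploits.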
