Abstract

Analog-to-digital converters (ADCs) with high sampling rates and output resolution are required for the design of mostly-digital transceivers in emerging multigigabit communication systems. A promising approach is to use a time-interleaved (TI) architecture with slower sub-ADCs in parallel, but mismatch among the sub-ADCs, if left uncompensated, can cause error floors in receiver performance. Conventional mismatch compensation schemes typically have complexity (in terms of the number of multiplications) that increases with the desired resolution at the output of the TI-ADC. In this paper, we investigate an alternative approach, in which mismatch and channel dispersion are compensated jointly, with the performance metric being overall link reliability rather than ADC performance. For an OFDM system, we characterize the structure of mismatch-induced interference, and demonstrate the efficacy of a frequency-domain interference suppression scheme whose complexity is independent of constellation size (which determines the desired resolution). Numerical results from computer simulation and from experiments on a hardware prototype show that the performance with the proposed joint mismatch and channel compensation technique is close to that of a system without mismatch. While the proposed technique works with offline estimates of the mismatch parameters, we also provide an iterative, online method for joint estimation of mismatch and channel parameters that leverages the training overhead already available in communication signals.
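
The mismatch-induced interference structure mentioned above can be seen in a minimal simulation. The sketch below (not from the paper; all gain and timing-skew values are hypothetical) models a two-way TI-ADC sampling a single tone at frequency f_in and shows that gain and timing mismatch between the two sub-ADCs produces an image spur at fs/2 - f_in:

```python
import numpy as np

# Illustrative sketch, not the paper's method: a 2-way time-interleaved ADC
# with gain and timing-skew mismatch between its two sub-ADCs. Mismatch in a
# 2-way TI-ADC maps an input tone at f_in to an image spur at fs/2 - f_in.

fs = 1.0                       # normalized sampling rate
N = 4096                       # number of samples
f_in = 451 / N                 # coherent input tone (assumed value)
n = np.arange(N)

# Assumed per-sub-ADC mismatch parameters (hypothetical values)
gains = np.array([1.00, 0.95])     # gain mismatch
t_skew = np.array([0.00, 0.02])    # timing skew, in sample periods

# Even samples come from sub-ADC 0, odd samples from sub-ADC 1
branch = n % 2
mismatched = gains[branch] * np.cos(2 * np.pi * f_in * (n + t_skew[branch]))

# Spectrum: expect the input tone at f_in plus an image at fs/2 - f_in
spec = np.abs(np.fft.rfft(mismatched)) / N
freqs = np.fft.rfftfreq(N, d=1 / fs)

for label, f in [("input tone", f_in), ("image spur", fs / 2 - f_in)]:
    k = np.argmin(np.abs(freqs - f))
    print(f"{label} at f = {freqs[k]:.4f}: {20 * np.log10(spec[k] + 1e-12):.1f} dB")
```

For two-way interleaving this pairing means that, in an OFDM receiver, each subcarrier sees mismatch interference primarily from its mirror subcarrier, which is what makes a per-subcarrier frequency-domain suppressor, with complexity independent of constellation size, a natural fit.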
