Abstract

Realization of all-digital baseband receiver processing for multi-Gigabit communication requires analog-to-digital converters (ADCs) of sufficient rate and output resolution. A promising architecture for this purpose is the time-interleaved ADC (TI-ADC), in which several lower-rate sub-ADCs are employed in parallel. However, the timing mismatch between the sub-ADCs, if left uncompensated, leads to error floors in receiver performance. Standard linear digital mismatch compensation (e.g., based on the zero-forcing criterion) requires a number of taps that increases with the desired resolution. In this paper, we show that oversampling provides a scalable (in the number of sub-ADCs and in the desired resolution) approach to mismatch compensation, allowing elimination of mismatch-induced error floors at reasonable complexity. While the structure of the interference due to mismatch is different from that due to a dispersive channel, there is a strong analogy between the role of oversampling for mismatch compensation and for channel equalization. We illustrate the efficacy of the proposed mismatch compensation techniques for an OFDM receiver.
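As a rough illustration of the problem setting (not the paper's algorithm), the following Python sketch simulates timing skew in a two-way TI-ADC and corrects it with a periodically time-varying linear filter: one short FIR per output phase, fitted here by least squares as a stand-in for the zero-forcing designs the paper analyzes. All parameters (number of sub-ADCs, skew value, tap count, test signal) are arbitrary illustrative choices; the test signal is bandlimited below Nyquist, which mirrors the excess bandwidth that oversampling provides.

```python
# Minimal sketch: timing-mismatch compensation for a 2-way TI-ADC using a
# periodically time-varying FIR bank. Least-squares fitting is a stand-in
# design criterion, not the zero-forcing method from the paper.
import numpy as np

rng = np.random.default_rng(0)

M = 2                 # number of sub-ADCs (illustrative choice)
skews = [0.0, 0.05]   # per-sub-ADC timing skew, fraction of a sample period
N = 4000              # number of samples
ntaps = 21            # taps per phase filter
half = ntaps // 2

# Bandlimited multisine test signal (band edge 0.4 < Nyquist 0.5 cycles/sample,
# i.e., mildly oversampled relative to the signal bandwidth).
freqs = rng.uniform(0.01, 0.4, size=8)
phases = rng.uniform(0, 2 * np.pi, size=8)

def x(t):
    return sum(np.cos(2 * np.pi * f * t + p) for f, p in zip(freqs, phases))

n = np.arange(N)
ideal = x(n.astype(float))        # mismatch-free uniform samples
t = n.astype(float)
for m in range(M):
    t[m::M] += skews[m]           # each sub-ADC samples every M-th instant
obs = x(t)                        # TI-ADC output with timing mismatch

# Compensator: the distortion is periodically time-varying with period M,
# so fit one FIR per output phase on sliding windows of the observed stream.
# (Fit and evaluated on the same realization, for brevity.)
idx = np.arange(half, N - half)
W = np.stack([obs[i - half : i + half + 1] for i in idx])
corrected = np.empty(len(idx))
for m in range(M):
    sel = (idx % M) == m
    h, *_ = np.linalg.lstsq(W[sel], ideal[idx[sel]], rcond=None)
    corrected[sel] = W[sel] @ h

print("RMS error before compensation:", np.std(obs[idx] - ideal[idx]))
print("RMS error after compensation: ", np.std(corrected - ideal[idx]))
```

A filter bank with one FIR per output phase is the natural linear structure here because timing mismatch is a periodically time-varying (not time-invariant) distortion; a single LTI filter cannot remove the images it creates at offsets of the sub-ADC rate.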
