Abstract

This paper proposes a novel digital adaptive blind background calibration technique for the gain, timing-skew, and offset mismatch errors in a time-interleaved analog-to-digital converter (TI-ADC). Based on frequency-shifted basis functions generated solely from the measured TI-ADC output, the three mismatch errors can be represented, extracted, and then adaptively subtracted from the TI-ADC output, which differs markedly from conventional methods (e.g., those using a bank of adaptive FIR filters). The advantage of the proposed technique is that it requires only the measured output signal and the TI-ADC channel count, with no additional information, and it is applicable to any TI-ADC without restrictions on channel count, sub-ADC sampling rate, signal type, and so on. Specifically, the frequency-shifted signal generator employs the Hilbert transform and requires no extra finite-impulse-response filters. Extensive simulation and measurement results demonstrate the effectiveness and superiority of the proposed technique for single-tone, multi-tone, wideband modulated, and multi-band modulated signals. In measurements of a 32 GS/s 16-channel TI-ADC system, the proposed technique improves the effective number of bits (ENOB) by 2 to 3 bits and the SFDR by 30 to 35 dB.
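
As a rough illustration of the idea described above, the following Python sketch shows one way frequency-shifted basis functions could be built from the measured output alone (the analytic signal obtained via the Hilbert transform, shifted by multiples of fs/M) and removed with an adaptive LMS-style loop. The function name blind_calibrate, the step size mu, and the specific update rule are hypothetical choices used only to illustrate the general approach; they are not the paper's actual algorithm.

import numpy as np
from scipy.signal import hilbert

def blind_calibrate(y, M, mu=1e-4):
    # y : measured TI-ADC output samples (real-valued 1-D array)
    # M : number of interleaved sub-ADC channels
    # mu: LMS step size (hypothetical value)
    N = len(y)
    n = np.arange(N)
    ya = hilbert(y)  # analytic signal via the Hilbert transform
    # Frequency-shifted basis functions built only from the output:
    # copies of the analytic signal shifted by k*fs/M model the
    # gain/timing-skew spurs, and pure tones at k*fs/M model the
    # offset spurs, for k = 1..M-1.
    shifted = [ya * np.exp(1j * 2 * np.pi * k * n / M) for k in range(1, M)]
    tones = [np.exp(1j * 2 * np.pi * k * n / M) for k in range(1, M)]
    basis = np.stack(shifted + tones)            # shape (2*(M-1), N)
    w = np.zeros(basis.shape[0], dtype=complex)  # adaptive coefficients
    out = np.empty(N)
    for i in range(N):
        phi = basis[:, i]
        e = y[i] - np.real(np.vdot(w, phi))      # subtract estimated spurs
        w += mu * e * phi                        # LMS coefficient update
        out[i] = e                               # calibrated sample
    return out

In this sketch the spur estimate is formed as a weighted sum of the frequency-shifted basis functions and subtracted from each output sample, while the weights adapt to minimize the residual correlation with those basis functions, mirroring the extract-and-subtract structure the abstract describes.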
