Abstract

In this work we investigate a new background calibration technique to compensate for sampling phase errors in time-interleaved analog-to-digital converters (TI-ADCs). Timing mismatches in TI-ADCs significantly degrade the performance of ultra-high-speed digital transceivers. Unlike previous proposals, the calibration technique used here optimizes a metric directly related to the performance of the communication system. The gradient of the mean-squared error (MSE) at the slicer with respect to the sampling phase of each interleave is estimated and used to minimize the timing errors of the TI-ADC by controlling programmable analog time-delay cells. Since (i) dedicated digital signal processing (DSP), such as cross-correlation or digital filtering of the received samples, is not required, and (ii) metrics such as the MSE are available in most commercial transceivers, the implementation reduces to a low-speed state machine. The technique is verified experimentally on a programmable-logic-based platform with a 2 GS/s 6-bit TI-ADC. The latter was fabricated in a 0.13 µm CMOS process and provides flexible sampling phase control. Experimental results show that the signal-to-noise ratio penalty of a digital BPSK receiver caused by sampling time errors in the TI-ADC can be reduced from 1 dB to less than 0.1 dB at a bit-error rate of 10⁻⁶.
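The abstract does not spell out the exact update rule, so the following Python sketch is an illustration only of the underlying idea: estimate the gradient of the slicer MSE of a BPSK signal with respect to each interleave's sampling phase and descend on it, as a stand-in for driving the programmable delay cells. Everything in the sketch is an assumption: the 4-way interleaving (`M`), the toy timing errors (`true_skew`), the `capture` front-end model, and the step sizes (`mu`, `delta`) are hypothetical, and the central finite-difference gradient is one plausible estimator, not necessarily the one used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 4                    # number of interleaves (assumed; not stated in the abstract)
N = 4096                 # samples per MSE estimate (arbitrary)
true_skew = np.array([0.00, 0.04, -0.03, 0.02])   # toy timing errors, in unit intervals

def capture(corrections):
    """Toy front-end model: a BPSK symbol stream sampled at one sample per
    symbol, where each interleave sees its timing error plus the current
    delay-cell correction (modeled as linear interpolation between symbols)."""
    bits = rng.choice([-1.0, 1.0], size=N)
    t = np.arange(N, dtype=float)
    skew = (true_skew + corrections)[np.arange(N) % M]
    idx = np.clip(t + skew, 0.0, N - 1.0)
    lo = np.floor(idx).astype(int)
    hi = np.minimum(lo + 1, N - 1)
    frac = idx - lo
    return (1.0 - frac) * bits[lo] + frac * bits[hi] + 0.02 * rng.standard_normal(N)

def slicer_mse(x):
    """Mean-squared error between the slicer input and the BPSK decisions (+1/-1)."""
    return np.mean((x - np.sign(x)) ** 2)

# Background calibration loop: central finite-difference estimate of the
# slicer-MSE gradient with respect to each interleave's phase correction,
# followed by a small gradient-descent step (hypothetical step sizes).
corr = np.zeros(M)
mu, delta = 0.1, 0.01
for _ in range(300):
    for m in range(M):
        up, dn = corr.copy(), corr.copy()
        up[m] += delta
        dn[m] -= delta
        grad = (slicer_mse(capture(up)) - slicer_mse(capture(dn))) / (2 * delta)
        corr[m] -= mu * grad   # in hardware this would adjust the delay cell

print("residual timing error (UI):", np.round(true_skew + corr, 3))
```

Because the cost is evaluated on live decision errors rather than on cross-correlations or dedicated digital filters, a loop like this can run in the background at a low update rate, which is consistent with the abstract's claim that the implementation reduces to a low-speed state machine.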
