Abstract

Time-interleaving (TI) is a major trend in high-speed ADC design. The major limitation of TI-ADCs is the mismatch among the ADC channels. Calibration techniques have been actively pursued to compensate for these mismatches. In this paper, we present a new spectral calibration technique for TI-ADCs. Although the technique does not run in the background, it requires no external calibration signal and places no constraint on the input signal, similar to blind estimation. Compared to benchmark designs, the new technique requires far less calibration hardware overhead and calibrates all mismatches in much less time. In practice, this efficient technique can run repetitively to track environmental changes, and thus addresses the needs of practical TI-ADCs well. The technique is verified with extensive simulations on a 2 GS/s, 12-bit, 16-channel ADC. With mismatch spurs above -40 dBc, the technique reliably suppresses the spurs to below -80 dBc within about 4000 samples, without the need for iteration.
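As background for the mismatch spurs discussed above, the following is a minimal, illustrative sketch (not the paper's calibration method) of how per-channel gain, offset, and timing mismatches in a time-interleaved ADC produce spurious tones at k*fs/M and k*fs/M ± fin. All mismatch magnitudes and the input frequency below are assumptions chosen only for illustration.

```python
# Illustrative sketch (assumption: not the paper's method) of mismatch spurs
# in a time-interleaved ADC, modeled with per-channel gain, offset, and skew.
import numpy as np

fs = 2.0e9            # aggregate sample rate (2 GS/s, as in the paper)
M = 16                # number of interleaved channels (as in the paper)
N = 4096              # number of output samples analysed (~4000, as in the paper)
fin = 0.113 * fs      # input tone frequency (arbitrary choice)

# Hypothetical per-channel mismatch values (gain error, offset, timing skew)
rng = np.random.default_rng(0)
gain = 1 + 0.005 * rng.standard_normal(M)    # ~0.5% gain mismatch
offset = 0.002 * rng.standard_normal(M)      # offset mismatch (full scale = 1)
skew = 2e-12 * rng.standard_normal(M)        # ~2 ps timing skew

n = np.arange(N)
ch = n % M                                   # channel serving output sample n
t = n / fs + skew[ch]                        # skewed sampling instants
x = gain[ch] * np.sin(2 * np.pi * fin * t) + offset[ch]

# Windowed spectrum of the interleaved output, normalized to the carrier
spec = np.abs(np.fft.rfft(x * np.hanning(N)))
spec_dbc = 20 * np.log10(spec / spec.max() + 1e-12)
freqs = np.fft.rfftfreq(N, d=1 / fs)

# Mask out the signal tone and its leakage skirt, then list the largest
# remaining bins: these land at k*fs/M and k*fs/M +/- fin, which is where
# a spectral calibration scheme would look for mismatch energy.
peak = np.argmax(spec_dbc)
masked = spec_dbc.copy()
masked[max(0, peak - 8):peak + 9] = -200.0
top = np.argsort(masked)[::-1][:6]
print("Largest mismatch spurs (frequency, level):")
for b in sorted(top):
    print(f"  {freqs[b] / 1e6:8.1f} MHz  {masked[b]:6.1f} dBc")
```

With the mismatch levels assumed here, the spurs appear in the -50 to -60 dBc range; the paper's technique targets cases where such spurs exceed -40 dBc and suppresses them below -80 dBc.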
