Abstract

The sampling rate of an ADC often limits the speed of a signal processing system. The sampling rate at the A/D interface can be increased by time-interleaving multiple component ADCs. Mismatches in offsets, gains, and sampling times among the component ADCs limit the performance of the ADC system. Previous time-interleaved ADC arrays have used careful layout, foreground calibration, and/or digital filters to minimize the effects of these mismatches. The time-interleaved ADC presented here uses monolithic analog background calibration to match the gains and offsets of its component pipelined ADCs. The contributions are an expandable adaptive background-calibration technique for parallel ADCs and a calibration loop built around a mixed-signal integrator. The fully differential prototype is fabricated in a 1.0 µm single-poly CMOS process with poly-thin-oxide-diffusion capacitors. It includes three pipelined ADCs, one algorithmic ADC, the calibration signal generator, channel control logic, and six mixed-signal integrators, each followed by a unity-gain buffer that supplies the offset or reference correction voltage to one of the pipelined ADCs. The SC integrator and ADC stages use telescopic opamps with source followers at the input.
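As a rough illustration of the background-calibration idea, the behavioral Python sketch below (our own illustration, not the authors' circuit) interleaves several channels with random offset and gain mismatches and converts a known two-level calibration signal in the background; the resulting errors drive LMS-style accumulators that stand in for the paper's mixed-signal integrators. The channel count, calibration levels, and loop gain are all assumptions chosen for illustration.

```python
# Behavioral sketch of analog background calibration for a
# time-interleaved ADC. Hypothetical model, not the authors' circuit:
# channel count, calibration levels, and loop gain are assumptions.
import numpy as np

M = 4                                   # interleaved channels (assumption)
rng = np.random.default_rng(0)
offsets = rng.normal(0.0, 0.01, M)      # per-channel offset mismatch
gains = 1.0 + rng.normal(0.0, 0.01, M)  # per-channel gain mismatch

# Correction state, standing in for the mixed-signal integrators that
# supply the offset / reference correction voltages.
off_corr = np.zeros(M)
gain_corr = np.ones(M)
mu = 0.01                               # integrator loop gain (assumption)

def convert(x, ch):
    """One conversion by channel `ch`, including its mismatch errors."""
    return gains[ch] * x + offsets[ch]

# Background calibration: a known two-level calibration signal is
# converted in the background; the error drives the integrators.
cal_levels = (-0.25, +0.25)             # calibration amplitudes (assumption)
for n in range(20000):
    ch = n % M                          # round-robin channel selection
    lvl = cal_levels[(n // M) % 2]      # each channel sees both levels
    y = gain_corr[ch] * convert(lvl, ch) - off_corr[ch]
    err = y - lvl                       # deviation from the known input
    off_corr[ch] += mu * err                  # integrate offset error
    gain_corr[ch] -= mu * err * np.sign(lvl)  # integrate gain error

print("residual offset:", gain_corr * offsets - off_corr)  # -> ~0
print("residual gain:  ", gain_corr * gains)               # -> ~1
```

Alternating the calibration levels lets each channel's offset and gain errors be separated: the offset term averages out of the gain update, and the gain term averages out of the offset update, so both corrections converge independently.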
