Abstract

The linearity of a pipeline analog-to-digital converter (ADC) is limited mainly by capacitor mismatch and finite operational amplifier (OPAMP) gain, which drive up power consumption and design complexity for high-resolution pipeline ADCs in modern nanometer CMOS processes. Digital calibration techniques that compensate for these analog errors in the pipeline stages have therefore become an increasingly common approach. This paper systematically introduces a novel interpolation-based digital calibration architecture that compensates for both linear and nonlinear errors in the pipeline stages. The new method does not require convergence, and the effect of calibration error is analyzed in detail. A prototype 20-MS/s pipeline ADC was fabricated in a 0.35-μm 3.3-V CMOS process. At 12-b resolution, the digital calibration improves the ADC differential nonlinearity and integral nonlinearity from 1.47 LSB and 7.85 LSB to 0.2 LSB and 0.27 LSB, respectively. For a 590-kHz sinusoidal input, calibration improves the ADC signal-to-noise-and-distortion ratio and spurious-free dynamic range from 41.3 dB and 52.1 dB to 72.5 dB and 84.4 dB, respectively. With the new calibration technique, low-gain OPAMPs and small capacitors can be used in the pipeline. The designed ADC achieves a 0.78-pJ/step figure of merit (FOM), which is among the lowest reported FOMs for high-resolution pipeline ADC designs. The new architecture requires an accurate calibration ADC (CalADC) and two digital decoders. The CalADC is implemented on-chip, occupying 6.5% of the die area and consuming 8.9% of the power. The decoders are synthesized with 912 gates and consume 23.4% of the ADC power.
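
The abstract reports the post-calibration SNDR, the sample rate, and the figure of merit, but not an explicit power number. As a rough cross-check, the sketch below applies the standard ENOB relation, ENOB = (SNDR - 1.76)/6.02, and the commonly used Walden FOM definition, FOM = P / (2^ENOB · fs); whether the paper uses exactly this FOM definition is an assumption, and the power value computed here is implied by that assumption rather than reported.

```python
# Minimal sketch: relate the reported post-calibration SNDR and sample rate
# to ENOB and the Walden FOM. Formulas are textbook definitions; the implied
# power is derived under that assumption, not a figure stated in the abstract.

SNDR_DB = 72.5              # reported post-calibration SNDR (dB)
FS_HZ = 20e6                # reported sample rate (20 MS/s)
FOM_J_PER_STEP = 0.78e-12   # reported figure of merit (0.78 pJ/step)

# Effective number of bits from SNDR.
enob = (SNDR_DB - 1.76) / 6.02            # ~11.75 bits

# Walden FOM: FOM = P / (2**ENOB * fs)  =>  implied power P = FOM * 2**ENOB * fs
implied_power_w = FOM_J_PER_STEP * (2 ** enob) * FS_HZ

print(f"ENOB          : {enob:.2f} bits")
print(f"Implied power : {implied_power_w * 1e3:.1f} mW (assuming Walden FOM)")
```

Running this gives an ENOB of about 11.75 bits and an implied total power on the order of 50 mW, which is consistent with the claim that the FOM is low for a high-resolution pipeline ADC in a 0.35-μm process.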
