Abstract

An analog background calibration approach is presented for the full calibration of pipeline analog-to-digital converters (ADCs). A well-trained neural network closely approximates the ideal 1.5-bit stage, and its residue is compared with that of the real 1.5-bit stage, which includes gain error and amplifier nonlinearities. The detected error is used to compensate for the imperfect residue via four polynomial calibration coefficients. The corrected residue enters the second stage of the pipeline ADC and follows the normal path to achieve high resolution. The proposed structure is verified in a 12-bit pipelined ADC composed of 11 stages: the first 10 stages have a 1.5-bit structure, and the last stage is a 2-bit flash. The sampling frequency is 100 MHz, and 10% non-ideal factors (5% each for the nonlinearity and gain errors, 10% aggregated) are applied to the first stage, with a 19.5 MHz sinusoidal input. Random noise is applied to the input to limit the effective number of bits (ENOB) to roughly 11.8. The ADC's evaluation parameters are extracted: the signal-to-noise and distortion ratio (SNDR) increases from 39.14 to 72.91 dB, the spurious-free dynamic range (SFDR) improves from 40.94 to 79.69 dB, and the ENOB rises from 6.2 to 11.82. The presented mechanism achieves acceptable accuracy for high-speed, high-resolution ADCs.
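To make the correction step concrete, the following is a minimal numerical sketch of the four-coefficient polynomial calibration described above. It is not the paper's implementation: the ideal 1.5-bit transfer function stands in for the trained neural-network reference, the 5% gain error and 5% cubic nonlinearity are an assumed error model matching the abstract's error budget, and the function names (sub_adc, ideal_residue, real_residue) are hypothetical.

```python
import numpy as np

VREF = 1.0  # reference voltage (arbitrary choice for this sketch)

def sub_adc(vin):
    """1.5-bit sub-ADC decision D in {-1, 0, +1} with +/-Vref/4 thresholds."""
    return np.where(vin > VREF / 4, 1, np.where(vin < -VREF / 4, -1, 0))

def ideal_residue(vin):
    """Ideal 1.5-bit MDAC transfer: Vres = 2*Vin - D*Vref."""
    return 2.0 * vin - sub_adc(vin) * VREF

def real_residue(vin, gain_err=0.05, nl=0.05):
    """Hypothetical non-ideal stage: 5% gain error plus a 5% cubic
    amplifier nonlinearity, mirroring the abstract's error budget."""
    v = ideal_residue(vin)
    return (1.0 - gain_err) * v - nl * v**3

# Reference data: in the paper the target residue comes from the trained
# neural network; here the ideal transfer function stands in for it.
vin = np.random.uniform(-VREF, VREF, 10_000)
target = ideal_residue(vin)
measured = real_residue(vin)

# Fit the four polynomial calibration coefficients (degree-3 fit).
coeffs = np.polyfit(measured, target, deg=3)

# Apply the correction and compare residual error before and after.
corrected = np.polyval(coeffs, measured)
print("RMS error before:", np.sqrt(np.mean((measured - target) ** 2)))
print("RMS error after: ", np.sqrt(np.mean((corrected - target) ** 2)))
```

In the actual background scheme, the coefficients would presumably be updated continuously while the ADC converts, rather than fitted once offline as in this sketch.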
