Abstract
A novel ultrafast and low-cost testing and calibration method for pipelined analog-to-digital converters (ADCs) is proposed. The ADC nonlinearities are modeled as segmented parameters together with interstage gain errors. During the test phase, a pure sine wave is applied at the input, and the model parameters are estimated from the output data using a system identification method. Significantly fewer samples are required than in traditional histogram testing. The modeled errors are then removed from the digital output codes during the calibration phase. Extensive simulations have been run to verify the correctness and robustness of the proposed method. With just 4000 samples, a 12-bit ADC can be accurately tested and calibrated to achieve less than 1 least significant bit (LSB) of integral nonlinearity (INL). Measurement results show that the ADC effective number of bits (ENOB) is improved from 9.7 to 10.84 bits and the spurious-free dynamic range (SFDR) is improved by 20 dB after calibration. The chip is fabricated in a 40-nm technology and consumes 10.71 mW at a sampling rate of 125 MS/s.
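The sketch below illustrates the flavor of the approach with the numbers from the abstract (4000 samples, 12 bits): a pure sine wave drives an ADC whose output is corrupted by code-dependent errors, a joint linear least-squares fit identifies the sine and a segmented error model, and the estimated errors are subtracted from the output codes. The 16-segment piecewise-constant error model, the test frequency, and all variable names are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of sine-wave-based identification and calibration,
# assuming a known coherent test tone and a piecewise-constant
# (segmented) code-dependent error model. Not the paper's exact method.
import numpy as np

rng = np.random.default_rng(0)
N, BITS, N_SEG = 4000, 12, 16        # samples, resolution, error segments
FS = 125e6                           # sampling rate from the abstract
FIN = FS * 997 / N                   # coherent sampling: 997 cycles in 4000 samples

# --- Synthesize a 12-bit ADC output driven by a pure sine wave ---
n = np.arange(N)
ideal = 2047.5 + 2000.0 * np.sin(2 * np.pi * FIN / FS * n)
codes = np.clip(np.round(ideal), 0, 2**BITS - 1).astype(int)
true_err = rng.normal(0.0, 2.0, N_SEG)              # hidden per-segment errors (LSB)
shift = BITS - int(np.log2(N_SEG))
raw = codes + np.round(true_err[codes >> shift])    # observed nonlinear output

# --- Test phase: joint least-squares fit of sine + segmented error model ---
seg = raw.astype(int) >> shift                      # segment hit by each sample
w = 2 * np.pi * FIN / FS
H = np.column_stack([np.cos(w * n), np.sin(w * n),
                     (seg[:, None] == np.arange(N_SEG)).astype(float)])
theta, *_ = np.linalg.lstsq(H, raw.astype(float), rcond=None)
est_err = theta[2:] - theta[2:].mean()              # segment errors, up to a common DC

# --- Calibration phase: subtract the modeled error from each output code ---
cal = raw - est_err[seg]
print("RMS error before/after calibration (LSB): %.2f / %.2f"
      % (np.std(raw - codes), np.std(cal - codes)))
```

Because the indicator columns for disjoint segments are mutually orthogonal, the fit reduces to a well-conditioned linear problem that a single batch of 4000 samples can solve, which is consistent with the abstract's claim of far fewer samples than histogram testing; the common-mode offset of the segment errors is unobservable and is harmless to linearity, so it is removed by mean subtraction.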