This paper presents digital background (BG) calibration techniques that correct both the inter-stage gain error (IGE) and the inter-channel mismatches of a 12-bit 2GS/s 4-channel time-interleaved (TI) pipeline ADC. The IGE is mitigated by a Least Mean Square (LMS) based calibration with a proposed Pseudo-Noise (PN) code injection into the 1.5-bit MDAC stages, which has neither an input-amplitude limitation nor a comparator requirement. The inter-channel offset and gain mismatches are eliminated by Modified Moving Average (MMA) and LMS-based calibrations, respectively. The inter-channel sampling-time mismatch is estimated with a correlation function and compensated by a digitally controlled delay line. Simulation results in 40nm CMOS show that the presented calibration techniques improve the SNDR of the 2GS/s TI ADC from 38dB to 61.49dB and the SFDR from 50.09dB to 74.04dB with a Nyquist input, while consuming only 23.3mW in a 184um × 183um area.
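As a rough illustration of the PN-injection LMS concept summarized above, the Python sketch below models a single 1.5-bit stage whose residue amplifier has a gain error; the LMS loop correlates the digital output with the known PN code and drives the backend gain estimate toward the actual gain. This is a minimal behavioral sketch, not the paper's implementation: the stage model, the step size `mu`, the injection amplitude `pn_amp`, and all signal levels are illustrative assumptions.

```python
import numpy as np

# Behavioral sketch of LMS-based inter-stage gain error calibration with a PN
# code injected into a 1.5-bit MDAC stage. All parameter values are assumptions.

rng = np.random.default_rng(1)
N = 300_000
g_true = 1.97            # actual residue-amplifier gain (ideal value is 2 for a 1.5-bit stage)
g_est = 2.0              # digital backend starts from the ideal gain
mu = 3e-4                # LMS step size (assumption)
pn_amp = 0.1             # PN injection amplitude referred to the MDAC input (assumption)

pn = rng.choice([-1.0, 1.0], size=N)                      # pseudo-noise code
vin = 0.4 * np.sin(2 * np.pi * 0.1234567 * np.arange(N))  # arbitrary test input

for n in range(N):
    d = float(np.clip(np.round(vin[n] / 0.5), -1, 1))       # 1.5-bit sub-ADC decision
    residue = g_true * (vin[n] - 0.5 * d + pn_amp * pn[n])   # PN added before amplification
    y = residue / g_est                                      # digital gain correction
    e = y - pn_amp * pn[n]                                   # subtract the known PN contribution
    g_est += mu * e * pn[n]                                  # decorrelate the output from the PN code

print(f"estimated gain: {g_est:.3f}  (true gain: {g_true})")
```

In a full TI pipeline ADC, such a per-stage gain estimate is typically applied in the digital reconstruction of each channel before the inter-channel offset, gain, and timing calibrations operate on the interleaved output.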