Abstract

This paper presents an all-digital background calibration method for gain, time-skew, and bandwidth mismatch in M-channel under-sampling time-interleaved analog-to-digital converter (TI-ADC) systems. First, the effects of offset, gain, time-skew, and bandwidth mismatch on a TI-ADC system are analyzed. Second, a parameter vector is constructed to correct gain, time-skew, and bandwidth mismatch, and this vector is computed using a bandpass fractional delay filter and the least squares (LS) algorithm. Because it is based on a bandpass fractional delay filter, the proposed technique works for ultra-high-frequency signals. In addition, the constructed parameter vector requires fewer filter taps than a derivative filter or a Hilbert filter, so fewer computing resources are needed to correct the input signal once the parameter vector has been obtained. Finally, the LS algorithm involves matrix inversions, which are complex to implement on an FPGA; they are therefore replaced by solving a system of linear equations. The LS algorithm is affected by quantization error and white Gaussian noise, and the simulation results verify the effectiveness of the proposed algorithm for sub-ADC SNRs from 30 dB to 100 dB, or equivalently, sub-ADC ENOBs from 5 bits to 16 bits. They also show that the proposed algorithm is not limited to the first Nyquist zone of the sub-ADCs. Furthermore, the measurement results show that the proposed method is effective in an actual time-interleaved system.
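To make the last point concrete, the sketch below is a minimal NumPy illustration (not the paper's implementation; the regression matrix A, observation vector d, and their dimensions are hypothetical placeholders) of the substitution described above: the explicit-inverse form of the LS estimate is replaced by directly solving the normal equations as a linear system, which avoids forming a matrix inverse in hardware.

```python
import numpy as np

# Hypothetical LS setup: A maps an unknown calibration parameter vector h
# to an observed response d. All values here are synthetic placeholders.
rng = np.random.default_rng(0)
A = rng.standard_normal((256, 16))                 # placeholder regression matrix
h_true = rng.standard_normal(16)                   # placeholder "true" parameters
d = A @ h_true + 1e-3 * rng.standard_normal(256)   # noisy observations

# Explicit-inverse form of the LS estimate (costly to implement on an FPGA):
h_inv = np.linalg.inv(A.T @ A) @ (A.T @ d)

# Equivalent form: solve the normal equations (A^T A) h = A^T d as a linear system.
h_solve = np.linalg.solve(A.T @ A, A.T @ d)

print(np.allclose(h_inv, h_solve))  # True: both forms give the same estimate
```

Both forms return the same parameter estimate; the linear-system form simply avoids computing an explicit inverse, which is the trade-off the abstract refers to for FPGA implementation.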
