Abstract

Mismatches degrade the dynamic performance of time-interleaved analog-to-digital converters (TIADCs). Linear mismatches can be calibrated by many mature methods, but nonlinearity mismatches must also be suppressed when higher performance is required. The background calibration method based on the input-free band (IFB) performs poorly for narrow-band signals. This brief proposes a correlation-based calibration method for nonlinearity mismatches in dual-channel TIADCs that performs well for both wide-band and narrow-band signals. The output samples are calibrated by reducing the residual distortions, which are approximated by multiplying the pseudo distortions by the estimated mismatch coefficients. The pseudo distortions are generated using a frequency shifter, a differentiator, and multipliers. The coefficients, which indicate the mismatch strength, are estimated by driving the zero-lag cross-correlation between the calibrated output samples and the calibrated pseudo distortions to zero. Simulations show that, compared with the IFB method, the proposed method improves the SFDR by tens of dBc for narrow-band input signals. For a 16-QAM signal, the error vector magnitude improvement over the IFB method is 35.48 dB.
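
To illustrate the correlation-driven coefficient estimation described above, the following is a minimal Python sketch, not the authors' implementation. It assumes a single mismatch coefficient and a hypothetical stand-in pseudo distortion (the actual pseudo distortions come from the frequency shifter, differentiator, and multipliers described in the brief), and it uses an LMS-style update that drives the zero-lag cross-correlation between the calibrated output and the pseudo distortion toward zero. The brief also calibrates the pseudo distortions themselves before correlation; that refinement is omitted here for brevity.

import numpy as np

def estimate_mismatch_coeff(y, d, mu=0.01):
    # LMS-style estimate of one nonlinearity-mismatch coefficient c.
    # y: distorted TIADC output samples; d: pseudo distortion samples.
    # The update pushes the zero-lag cross-correlation between the
    # calibrated output (y - c*d) and the pseudo distortion toward zero.
    c = 0.0
    for yk, dk in zip(y, d):
        y_cal = yk - c * dk        # calibrated output sample
        c += mu * y_cal * dk       # correlation-driven coefficient update
    return c

# Synthetic demo (all signals hypothetical, for illustration only).
n = np.arange(4096)
x = np.sin(2 * np.pi * 0.11 * n)       # clean narrow-band input
d = ((-1) ** n) * x**2                 # stand-in pseudo distortion
c_true = 0.02                          # assumed mismatch strength
y = x + c_true * d                     # modeled dual-channel TIADC output

c_hat = estimate_mismatch_coeff(y, d)
y_cal = y - c_hat * d                  # calibrated output samples
print(f"estimated coefficient: {c_hat:.4f} (true: {c_true})")

Because the update accumulates y_cal * d, its steady state is reached when the zero-lag cross-correlation of the calibrated output and the pseudo distortion vanishes, matching the estimation criterion stated in the abstract.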
