Abstract

To enhance the effective resolution of time-interleaved analog-to-digital converters (TI-ADCs), both linear and nonlinear channel mismatches must be carefully calibrated. This paper focuses on a bandwidth-efficient background calibration method for nonlinear errors in M-channel TI-ADCs. The method combines the least-mean-squares (LMS) algorithm with a modest degree of oversampling to achieve adaptive mismatch tracking. Calibration performance and computational complexity are investigated and evaluated through behavioral-level simulations. Furthermore, a calibration strategy for narrow-band input signals is proposed and verified as an improvement over the basic calibration structure for such signals.
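
To illustrate the general principle behind LMS-based background calibration of channel mismatch (not the paper's exact algorithm, whose reference signal is derived from the oversampled band), the following minimal Python sketch adapts a per-channel gain correction coefficient for each sub-ADC. The mismatch model, step size, and the use of the ideal input as the reference are all assumptions for illustration.

```python
import numpy as np

# Hypothetical sketch of an LMS update for per-channel gain-mismatch
# correction in an M-channel TI-ADC. In a real background calibration,
# the reference would be derived from redundancy created by oversampling,
# not from the ideal input, which is unavailable in practice.

rng = np.random.default_rng(0)
M = 4                                      # number of interleaved channels (assumed)
N = 20000                                  # total samples
mu = 0.01                                  # LMS step size (assumed)

t = np.arange(N)
x = np.sin(2 * np.pi * 0.037 * t)          # assumed test tone as the ideal input
true_gains = 1 + 0.02 * rng.standard_normal(M)   # per-channel gain errors
y = x * true_gains[t % M]                  # TI-ADC output with gain mismatch

w = np.ones(M)                             # adaptive correction gains, one per channel
for n in range(N):
    ch = n % M                             # channel that produced this sample
    y_corr = w[ch] * y[n]                  # corrected output sample
    e = x[n] - y_corr                      # error against the reference
    w[ch] += mu * e * y[n]                 # LMS coefficient update

print("estimated correction gains:", w)
print("ideal correction gains:    ", 1 / true_gains)
```

Because each channel's coefficient is updated only on its own samples, the loop converges to the reciprocal of each channel's gain error, and the same update structure extends to nonlinear correction terms by adding higher-order basis functions of the channel output.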
