Abstract
A new digital background calibration technique for gain mismatches and sample-time mismatches in a Time-Interleaved Analog-to-Digital Converter (TI-ADC) is presented to reduce the circuit area. In the proposed technique, the gain mismatches and the sample-time mismatches are calibrated using pseudo-aliasing signals instead of the bank of adaptive FIR filters used conventionally. The pseudo-aliasing signals are generated and subtracted from the ADC output, where the pseudo-aliasing generator consists of a Hadamard transform followed by a fixed FIR filter. For a two-channel 10-bit TI-ADC, the proposed technique reduces the required word length of the FIR filter by about 50% compared with the conventional technique, without a look-up table (LUT). In addition, the proposed technique requires only one FIR filter, whereas the conventional bank of adaptive filters requires (M-1) FIR filters in an M-channel TI-ADC.
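To make the structure concrete, below is a minimal NumPy sketch of the pseudo-aliasing idea for the simplest case: a two-channel TI-ADC with gain mismatch only. The coefficient update and all names here are illustrative assumptions rather than the paper's exact implementation; the sample-time-mismatch path would additionally pass the signal through the fixed FIR filter (a differentiator) before the Hadamard modulation.

import numpy as np

N = 1 << 16
fin = 0.137  # input frequency, normalized to fs = 1 (kept away from fs/4)

# Two-channel TI-ADC with a 1% gain mismatch: even samples see gain g0,
# odd samples see gain g1.
x = np.sin(2 * np.pi * fin * np.arange(N))
g0, g1 = 1.0, 1.01
y = np.where(np.arange(N) % 2 == 0, g0, g1) * x

# The Hadamard sequence for M = 2 is simply (+1, -1, +1, -1, ...); modulating
# the ADC output by it shifts the spectrum by fs/2, which is exactly where the
# gain-mismatch image sits, yielding the pseudo-aliasing reference.
h = np.where(np.arange(N) % 2 == 0, 1.0, -1.0)

# Background loop: subtract the scaled pseudo-aliasing signal and adapt the
# single coefficient c by correlating the calibrated output with its own
# Hadamard-modulated copy; this correlation is zero only when the image is
# nulled. The LMS-style update is an assumption made for illustration.
mu = 2e-4
c = 0.0
y_cal = np.empty(N)
for n in range(N):
    p = h[n] * y[n]                  # pseudo-aliasing sample
    y_cal[n] = y[n] - c * p          # subtract scaled pseudo-aliasing signal
    c += mu * h[n] * y_cal[n] ** 2   # drive the image component to zero

print(f"c = {c:+.5f}   ideal (g0 - g1)/(g0 + g1) = {(g0 - g1)/(g0 + g1):+.5f}")

For M > 2 channels, the alternating-sign sequence generalizes to rows of the M-by-M Hadamard matrix, and sample-time mismatch is handled by the same modulate-and-subtract structure with a fixed differentiating FIR filter in the pseudo-aliasing path, which is why a single fixed filter can replace the (M-1) adaptive filters.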