Abstract
This study presents a new adaptive blind calibration structure for gain and timing mismatch errors in a two-channel Time-Interleaved Analog-to-Digital Converter (TI-ADC). The technique corrects the mismatch errors at the converter output without interrupting normal TI-ADC operation. The TI-ADC output is modeled by a Taylor series approximation, in which the effect of gain and timing mismatch is captured by the first-order term. A Least-Mean-Square (LMS), correlation-based algorithm identifies the Taylor series coefficients: the proposed algorithm correlates the input signal with its chopped image, or with its chopped and delayed image. The coefficients are identified blindly, and the method is simpler than previous comparable techniques. It also avoids the drawbacks of an earlier technique based on the Taylor series approximation and the filtered-X LMS algorithm. Simulation results confirm these claims and show that, with six sinusoidal inputs spanning the full bandwidth of the proposed technique, a 48.2 dB improvement in Spurious-Free Dynamic Range (SFDR) is achieved.
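To make the mechanism concrete, the following is a minimal Python sketch of the first-order Taylor mismatch model and a blind, correlation-driven LMS update. It is not the paper's implementation: the mismatch values, step sizes, test tone, and the use of a numerical differentiator as the "chopped and delayed" regressor are all illustrative assumptions.

```python
import numpy as np

# Sketch: two-channel TI-ADC output modeled to first order in Taylor series,
# y[n] = x[n] + c[n]*(g*x[n] + d*x'[n]), where c[n] = (-1)^n is the chopping
# sequence, g is the gain mismatch, and d is the timing skew (in samples).
# All numeric values below are hypothetical.

N = 1 << 15
n = np.arange(N)
f0 = 0.1013                                         # hypothetical test tone (cycles/sample)
x = np.sin(2 * np.pi * f0 * n)                      # ideal samples
xdot = 2 * np.pi * f0 * np.cos(2 * np.pi * f0 * n)  # exact derivative per sample

c = (-1.0) ** n                                     # chopping sequence (+1, -1, +1, ...)
g_true, d_true = 0.02, 0.05                         # assumed gain error and timing skew

# Mismatched TI-ADC output under the first-order model
y = x + c * (g_true * x + d_true * xdot)

# Crude central-difference differentiator standing in for an FIR differentiator;
# its gain roll-off makes the skew estimate slightly biased in this sketch.
ydot = np.gradient(y)

mu_g, mu_d = 1e-3, 1e-3                             # LMS step sizes (assumed)
g_hat = d_hat = 0.0
for k in range(N):
    # Correct the output with the current coefficient estimates
    z = y[k] - c[k] * (g_hat * y[k] + d_hat * ydot[k])
    # Blind update: drive the correlation of the corrected output with its
    # chopped image (and chopped derivative image) toward zero
    g_hat += mu_g * z * c[k] * y[k]
    d_hat += mu_d * z * c[k] * ydot[k]

print(f"g_hat = {g_hat:+.4f} (true {g_true}), d_hat = {d_hat:+.4f} (true {d_true})")
```

The fixed point of the update is the setting at which the corrected output is uncorrelated with its chopped image, which to first order occurs when the estimates match the true mismatch coefficients; this is the sense in which the identification is blind, since no reference signal is required.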