Abstract

This paper proposes a novel all-digital blind background calibration technique to mitigate timing mismatch in time-interleaved analog-to-digital converters (TIADCs). In the estimation module, a subtraction-based error-extraction function reduces computational complexity, while a variable-step-size least-mean-squares (VSS-LMS) algorithm accelerates convergence without sacrificing output accuracy. In the compensation module, a dual-stage Taylor-series expansion structure is introduced to maintain overall output performance. The proposed architecture is applied to a 12-bit 3 GS/s four-channel TIADC model, and its effectiveness for single-tone and multi-tone signals is verified through systematic testing and analysis. Simulation results show that the spurious-free dynamic range (SFDR) improves by 54.53 dB in the single-tone case and that the timing-mismatch estimate converges within 1000 samples. The proposed calibration circuit has been synthesized with a 28 nm standard-cell library to assess its hardware cost, occupying 0.051 mm² of area and dissipating 67.5 mW of average power within the integrated chip architecture. The proposed technique offers a viable optimization path for improving TIADC efficiency in high-speed systems.
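The abstract only names the building blocks of the calibration; the short Python sketch below is a rough, hypothetical illustration of two of them: first-order Taylor-series skew compensation and an LMS-type skew update with a shrinking step size, applied to a four-channel interleaved model. The sample rate and channel count are taken from the abstract, but the test frequency, skew values, step-size schedule, and the use of an ideal reference signal are assumptions made only for this sketch (the paper's method is blind and uses a subtraction-based error extraction instead), and the dual-stage compensation structure is not reproduced here.

```python
import numpy as np

# Hypothetical illustration only (not the authors' circuit or exact algorithm).
# A 4-channel TIADC with per-channel timing skew, followed by a first-order
# Taylor-series correction driven by an LMS-style skew estimate.

M = 4                       # number of interleaved sub-ADCs (from the abstract)
fs = 3e9                    # aggregate sample rate, 3 GS/s (from the abstract)
N = 4096                    # number of samples to simulate (assumed)
f_in = 397e6                # single-tone test frequency (assumed)

true_skew = np.array([0.0, 3e-12, -2e-12, 1.5e-12])   # per-channel skew in s (assumed)

n = np.arange(N)
ch = n % M                              # which sub-ADC takes each sample
t_ideal = n / fs
t_actual = t_ideal + true_skew[ch]      # each sub-ADC samples slightly early/late
x = np.sin(2 * np.pi * f_in * t_actual) # skewed TIADC output

def derivative(sig):
    """Central-difference FIR differentiator approximating dx/dt."""
    d = np.zeros_like(sig)
    d[1:-1] = (sig[2:] - sig[:-2]) * fs / 2.0
    return d

dx = derivative(x)
ref = np.sin(2 * np.pi * f_in * t_ideal)  # ideal reference, used here only to
                                          # form an error; the paper's method is blind

# LMS-type update of the skew estimate with a decaying step size (a crude
# stand-in for the paper's variable-step-size rule, which the abstract does
# not specify). First-order Taylor correction: y[n] = x[n] - skew * x'[n].
skew_hat = np.zeros(M)
mu = 2e-21
for i in range(2, N - 1):
    k = ch[i]
    y = x[i] - skew_hat[k] * dx[i]        # Taylor-series compensation
    e = ref[i] - y                        # error signal (blind extraction in the paper)
    mu_i = mu / (1.0 + 4.0 * i / N)       # step size shrinks as calibration proceeds
    skew_hat[k] -= mu_i * e * dx[i]       # gradient-descent update of the skew

print("true skew (s):", true_skew)
print("estimated (s):", skew_hat)        # estimates should move toward true_skew
```

In this toy setting the estimates drift toward the injected skews within a few thousand samples; the actual convergence behavior, error-extraction function, and dual-stage compensation accuracy reported in the paper depend on details that the abstract does not provide.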
