Abstract
Sample-time error can degrade the performance of time-interleaved analog-to-digital converters (TIADCs). This paper presents a fully digital background algorithm, together with its hardware implementation, that estimates and corrects the timing mismatch errors between four interleaved channels. The proposed algorithm combines a low computational burden with high performance, and is based on a simplified representation of the coefficients of the Lagrange interpolator. Simulation results show that it suppresses error tones across the entire Nyquist band. For a four-channel TIADC with 10-bit resolution and an input signal frequency of $$0.09f_s$$, the proposed algorithm improves the signal-to-noise-and-distortion ratio (SNDR) and spurious-free dynamic range (SFDR) by 19.27 dB and 35.2 dB, respectively; for an input signal frequency of $$0.45f_s$$, the improvements are 33.06 dB in SNDR and 43.14 dB in SFDR. In addition to the simulations, the algorithm was implemented in hardware for real-time evaluation. Its low computational burden allowed an FPGA implementation with low logic-resource usage and a high system clock speed (926.95 MHz for the four-channel implementation). The proposed architecture can therefore be used as a post-processing algorithm in host processors for data acquisition systems to improve TIADC performance.
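To make the correction idea concrete: the paper's method rests on Lagrange fractional-delay interpolation, in which each channel's samples are realigned by filtering with Lagrange coefficients computed from that channel's timing skew. The sketch below is only a rough illustration of that underlying principle, not the authors' simplified-coefficient implementation: the function names (`lagrange_fd_coeffs`, `correct_tiadc_skew`), the cubic filter order, and the assumption that the per-channel skews are already known from a separate estimation stage are all hypothetical choices made here for illustration.

```python
import numpy as np

def lagrange_fd_coeffs(delay, order=3):
    """FIR coefficients of a Lagrange fractional-delay filter.

    h[n] = prod_{m != n} (delay - m) / (n - m), n = 0..order.
    Accuracy is best when `delay` lies near order / 2.
    """
    k = np.arange(order + 1)
    h = np.ones(order + 1)
    for m in range(order + 1):
        mask = k != m
        h[mask] *= (delay - m) / (k[mask] - m)
    return h

def correct_tiadc_skew(x, skews, order=3):
    """Illustrative timing-skew correction for an M-channel TIADC.

    x     : interleaved full-rate output samples (1-D array)
    skews : per-channel timing errors in units of the full-rate sample
            period T_s (hypothetical: assumed already estimated);
            positive skew means the channel samples late.
    """
    M = len(skews)
    y = x.astype(float).copy()
    base = order // 2  # integer part of the delay; centres the interpolator
    for ch, d in enumerate(skews):
        # A channel that samples d*T_s late is realigned by delaying its
        # samples by d, i.e. filtering with a (base + d)-sample delay and
        # compensating the integer part afterwards.
        h = lagrange_fd_coeffs(base + d, order)
        xf = np.convolve(x, h)[base:base + len(x)]
        y[ch::M] = xf[ch::M]
    return y

# Example: a 0.09 fs tone sampled by four channels with small known skews
# (hypothetical values, expressed as fractions of T_s).
fs = 1.0
n = np.arange(4096)
skews = np.array([0.0, 0.02, -0.015, 0.01])
t = (n + skews[n % 4]) / fs            # actual (skewed) sampling instants
x = np.sin(2 * np.pi * 0.09 * t)
y = correct_tiadc_skew(x, skews)
```

Note that correcting each channel on the full-rate sequence uses neighbouring samples that are themselves skewed; for small skews this is the usual first-order approximation, and it is where a background estimation loop such as the one described in the abstract would supply the skew values.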