Abstract
This article presents a background calibration method for pipeline analog-to-digital converters (ADCs) that combines a genetic algorithm (GA) with a neural network (NN). The proposed method uses the ADC outputs or individual stage sub-ADC outputs for NN training, employs the GA for global optimization of the NN's initial weights to avoid being trapped in local optima, and adopts a parallel pipeline architecture to build a high-throughput calibration circuit with an optimized multiply-accumulate (MAC) unit that minimizes resource consumption. In simulations on a 6-stage 14-bit pipelined ADC model, the proposed method outperforms traditional calibration techniques and other NN-based calibration strategies. Specifically, after calibration, the signal-to-noise and distortion ratio (SNDR), spurious-free dynamic range (SFDR), and effective number of bits (ENOB) improve significantly from 57.72 dB, 59.77 dB, and 8.79 bits to 104.61 dB, 152.64 dB, and 17.08 bits, respectively.
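To illustrate the general idea of GA-initialized NN calibration (not the authors' implementation), the sketch below uses a tiny MLP to map raw, nonlinearity-distorted ADC codes to corrected values, a simple genetic algorithm to search for good initial weights, and a finite-difference gradient refinement from that starting point. The ADC error model, network size, GA operators, and all hyperparameters are illustrative assumptions.

```python
# Minimal sketch of GA-initialized NN calibration for a pipelined ADC.
# Assumptions: a toy cubic-nonlinearity error model, a 1-8-1 MLP correction
# network, and a basic selection/crossover/mutation GA.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: ideal ramp vs. ADC codes with a toy stage nonlinearity.
ideal = np.linspace(-1.0, 1.0, 2048)
raw = ideal + 0.02 * ideal**3 + 0.001 * rng.standard_normal(ideal.size)

def mlp_forward(w, x):
    """Tiny 1-8-1 MLP; w is a flat 25-element parameter vector (assumed layout)."""
    W1, b1 = w[:8].reshape(8, 1), w[8:16]
    W2, b2 = w[16:24], w[24]
    h = np.tanh(x[:, None] @ W1.T + b1)   # hidden layer
    return h @ W2 + b2                    # linear output

def mse(w):
    return np.mean((mlp_forward(w, raw) - ideal) ** 2)

# GA: global search over candidate initial weight vectors.
POP, DIM, GENS = 40, 25, 60
pop = rng.uniform(-1, 1, (POP, DIM))
for _ in range(GENS):
    fitness = np.array([mse(w) for w in pop])
    elite = pop[np.argsort(fitness)[:POP // 2]]                # keep best half
    parents = elite[rng.integers(0, len(elite), (POP - len(elite), 2))]
    mask = rng.random((POP - len(elite), DIM)) < 0.5
    children = np.where(mask, parents[:, 0], parents[:, 1])   # uniform crossover
    children += 0.05 * rng.standard_normal(children.shape)    # mutation
    pop = np.vstack([elite, children])

w = pop[np.argmin([mse(w) for w in pop])]   # GA result = NN initial weights

# Gradient refinement from the GA-chosen starting point (finite differences).
lr, eps = 0.05, 1e-4
for _ in range(300):
    grad = np.array([(mse(w + eps * np.eye(DIM)[i]) - mse(w - eps * np.eye(DIM)[i]))
                     / (2 * eps) for i in range(DIM)])
    w -= lr * grad

print(f"post-calibration MSE: {mse(w):.2e}")
```

The GA stage only has to place the weights in a good basin of attraction; the subsequent gradient step does the fine correction, which mirrors the paper's motivation of avoiding local-optima traps during NN training.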