This article presents a background calibration method for pipelined analog-to-digital converters (ADCs) that combines a genetic algorithm (GA) with a neural network (NN). The proposed method uses the overall ADC output or the individual stage sub-ADC outputs as NN training data, employs the GA to globally optimize the NN's initial parameters so that training does not become trapped in local optima, and adopts a parallel pipelined architecture to build a high-throughput calibration circuit whose multiply-accumulate (MAC) units are optimized to minimize resource consumption. In simulations of a 6-stage, 14-bit pipelined ADC model, the proposed method outperformed traditional calibration techniques and other NN-based calibration strategies. Specifically, after calibration, the signal-to-noise-and-distortion ratio (SNDR), spurious-free dynamic range (SFDR), and effective number of bits (ENOB) improve from 57.72 dB, 59.77 dB, and 8.79 bits to 104.61 dB, 152.64 dB, and 17.08 bits, respectively.
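To make the GA-plus-NN idea concrete, the following is a minimal sketch, not the paper's implementation: a GA searches over the initial weights of a small one-hidden-layer NN that maps raw per-stage sub-ADC codes to an ideal reference output, and gradient descent then fine-tunes from the GA-selected starting point. The toy 3-stage ADC model with gain errors, the network size, and all hyper-parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- toy pipelined-ADC model: per-stage codes distorted by assumed gain errors ---
N, STAGES, H = 4096, 3, 8
ideal = np.linspace(-1.0, 1.0, N)                       # ideal (calibrated) output
gains = np.array([1.02, 0.97, 1.05])                    # assumed inter-stage gain errors
stage_codes = np.stack([ideal * g + 0.01 * rng.standard_normal(N)
                        for g in gains], axis=1)        # NN input: raw stage codes

DIM = STAGES * H + H + H + 1                            # total NN parameter count

def unpack(theta):
    """Split a flat parameter vector into the NN's weights and biases."""
    w1 = theta[:STAGES * H].reshape(STAGES, H)
    b1 = theta[STAGES * H:STAGES * H + H]
    w2 = theta[STAGES * H + H:STAGES * H + 2 * H]
    b2 = theta[-1]
    return w1, b1, w2, b2

def forward(theta, x):
    w1, b1, w2, b2 = unpack(theta)
    return np.tanh(x @ w1 + b1) @ w2 + b2

def mse(theta, x, t):
    return np.mean((forward(theta, x) - t) ** 2)

def grad(theta, x, t):
    """Analytic gradient of the MSE for the one-hidden-layer tanh network."""
    w1, b1, w2, b2 = unpack(theta)
    h = np.tanh(x @ w1 + b1)
    y = h @ w2 + b2
    dy = 2.0 * (y - t) / len(t)
    dw2, db2 = h.T @ dy, dy.sum()
    dpre = np.outer(dy, w2) * (1.0 - h ** 2)
    dw1, db1 = x.T @ dpre, dpre.sum(axis=0)
    return np.concatenate([dw1.ravel(), db1, dw2, [db2]])

# --- GA: global search over initial NN parameters to avoid poor local optima ---
POP, GENS = 40, 30
pop = rng.normal(0.0, 0.5, (POP, DIM))
for _ in range(GENS):
    fitness = np.array([mse(ind, stage_codes, ideal) for ind in pop])
    parents = pop[np.argsort(fitness)[:POP // 2]]       # truncation selection
    children = []
    while len(children) < POP - len(parents):
        a, b = parents[rng.integers(len(parents), size=2)]
        child = np.where(rng.random(DIM) < 0.5, a, b)   # uniform crossover
        child += 0.05 * rng.standard_normal(DIM)        # Gaussian mutation
        children.append(child)
    pop = np.vstack([parents, np.array(children)])

best = min(pop, key=lambda ind: mse(ind, stage_codes, ideal))
theta = best.copy()                                     # GA-selected initial point

# --- gradient-descent fine-tuning from the GA-provided initialization ---
lr = 0.05
for _ in range(3000):
    theta -= lr * grad(theta, stage_codes, ideal)

print(f"post-calibration MSE: {mse(theta, stage_codes, ideal):.2e}")
```

In hardware, the trained network's inference (the `forward` pass above) would be mapped onto shared MAC units in a parallel pipeline; this sketch only illustrates the GA-seeded training procedure in software.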