This paper proposes a novel digital adaptive blind background calibration technique for the gain, timing-skew, and offset mismatch errors in a time-interleaved analog-to-digital converter (TI-ADC). Using frequency-shifted basis functions generated solely from the measured TI-ADC output, the three mismatch errors are represented, extracted, and then adaptively subtracted from the output, in contrast to conventional methods that rely on, e.g., a bank of adaptive FIR filters. The advantage of the proposed technique is that it requires only the measured output signal and the TI-ADC channel count, with no additional information; it is applicable to any TI-ADC, without restrictions on channel count, sub-ADC sampling rate, signal type, and so on. Specifically, the frequency-shifted signal generator employs the Hilbert transform and requires no extra finite-impulse-response (FIR) filters. Extensive simulation and measurement results demonstrate the effectiveness and superiority of the proposed technique for single-tone, multi-tone, wideband-modulated, and multi-band modulated signals. In measurements on a 32 GS/s 16-channel TI-ADC system, the proposed technique improves the effective number of bits (ENOB) by 2 to 3 bits and the spurious-free dynamic range (SFDR) by 30 to 35 dB.
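To illustrate the core idea of representing a mismatch error with a frequency-shifted basis function built only from the measured output, the following toy sketch (not the paper's algorithm) considers the simplest case: gain mismatch alone in a 2-channel TI-ADC with a single-tone input. Gain mismatch produces an image spur at fs/2 - f0; multiplying the output by (-1)^n shifts its spectrum by fs/2, yielding a basis function aligned with that spur. For brevity, the adaptive loop is replaced here by a one-shot least-squares null at the image frequency; the tone frequency, record length, and 2% mismatch value are arbitrary assumptions for the demo.

```python
import numpy as np

M = 2                       # number of interleaved channels
N = 4096                    # record length (coherent sampling assumed)
f0 = 479 / N                # normalized input tone frequency (assumed)
eps = 0.02                  # assumed 2% gain mismatch between sub-ADCs
n = np.arange(N)
x = np.sin(2 * np.pi * f0 * n)                  # ideal samples
gains = np.where(n % M == 0, 1 + eps, 1 - eps)  # per-channel gains
y = gains * x                                   # measured TI-ADC output

# Frequency-shifted basis function from the output alone: (-1)^n shifts
# the spectrum by fs/2, where the gain-mismatch image lies.
b = ((-1.0) ** n) * y

# One-shot surrogate for the adaptive coefficient: choose c so that the
# calibrated output has no energy at the image frequency fs/2 - f0.
f_img = 0.5 - f0
probe = np.exp(-2j * np.pi * f_img * n)
c = np.real(np.dot(probe, y) / np.dot(probe, b))

y_cal = y - c * b           # subtract the represented mismatch component

def spur_db(sig):
    """Image-spur level relative to the signal tone, in dB."""
    S = np.abs(np.fft.rfft(sig))
    return 20 * np.log10(S[int(round(f_img * N))] / S[int(round(f0 * N))])

print(f"image spur before: {spur_db(y):.1f} dB, after: {spur_db(y_cal):.1f} dB")
```

With a 2% mismatch the image spur sits near -34 dBc before calibration and falls to the numerical noise floor afterward, since subtracting c·b cancels the shifted component while leaving only a benign overall gain scaling (1 - eps²). The paper's technique generalizes this idea to all three mismatch types, arbitrary channel counts, and adaptive operation with Hilbert-transform-based basis generation.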