This brief presents a novel scheme for extracting the three principal channel-mismatch errors of Time-Interleaved Analog-to-Digital Converters (TI-ADCs). The algorithm relies on a reference Analog-to-Digital Converter (ADC) for background calibration. The main advantage of the proposed architecture is that offset, gain, and timing-skew errors can all be calibrated with a single simple structure, which is achieved by adapting a dedicated coefficient for each error. The main idea is first introduced through mathematical expressions; the architecture for extracting each error is then derived from these calculations, and a generalized scheme that extracts all three errors is demonstrated. Finally, system-level simulations of a 12-bit, 4-channel TI-ADC confirm the correct behavior of the proposed algorithm. With an average timing error of 11 ps per channel, a sampling frequency of 8 GHz, and a 1 GHz single-tone input, the calibration achieves a maximum SFDR of 81 dB; the corresponding maximum SNDR is 66 dB.
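The error model summarized above (per-channel offset, gain, and timing skew, corrected by adapting one coefficient per error against a reference ADC) can be illustrated with a rough NumPy sketch. Everything here is an illustrative assumption, not the brief's actual algorithm: the error magnitudes, the slow reference-ADC decimation factor, and the LMS-style coefficient update are all placeholders, and the timing-skew coefficient is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 4                          # number of interleaved channels (per the brief)
fs = 8e9                       # aggregate sampling rate (8 GHz, per the brief)
fin = fs * 1021 / 8192         # ~1 GHz tone, coherent with fs for a clean demo
N = 1 << 14                    # total TI-ADC samples

# Hypothetical per-channel mismatch errors (spreads are illustrative)
offset = rng.normal(0.0, 5e-3, M)          # assumed offset spread
gain = 1.0 + rng.normal(0.0, 5e-3, M)      # assumed gain spread
skew = rng.normal(0.0, 11e-12, M)          # ~11 ps timing skew, as in the brief

n = np.arange(N)
ch = n % M                                 # which sub-ADC takes sample n
t_ideal = n / fs
t_real = t_ideal + skew[ch]                # skewed sampling instants

x = np.sin(2 * np.pi * fin * t_real)       # what the sub-ADCs actually sample
y = gain[ch] * x + offset[ch]              # distorted TI-ADC output

# Reference ADC: assumed ideal but slow, digitizing every R-th sample
# (R chosen coprime to M so every channel gets visited in turn).
R = 5
ref_idx = n[::R]
ref = np.sin(2 * np.pi * fin * t_ideal[ref_idx])

# LMS-style background adaptation of one gain and one offset coefficient per
# channel, driven by the error against the reference samples. Timing-skew
# correction (e.g. a fractional-delay filter) is left out of this sketch.
g_hat = np.ones(M)
o_hat = np.zeros(M)
mu = 0.02
for j, k in enumerate(ref_idx):
    c = k % M
    e = ref[j] - (g_hat[c] * y[k] + o_hat[c])
    g_hat[c] += mu * e * y[k]
    o_hat[c] += mu * e

y_cal = g_hat[ch] * y + o_hat[ch]          # calibrated TI-ADC output
```

In this toy setup the adapted coefficients drift toward the inverse of each channel's error (`g_hat` toward `1/gain`, `o_hat` toward `-offset/gain`); the residual skew error is what the brief's third, timing-skew coefficient would remove.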