A New Technique for Gain and Timing Calibration in Time-Interleaved ADCs Using an Estimation Signal
Abstract—This paper proposes a new technique for identifying mismatch errors in time-interleaved analog-to-digital converters (TI-ADCs). An estimation signal comprising two parts is injected: one part for identifying the gain mismatch error and the other for the timing mismatch error. As it passes through the TI-ADC's channels, the estimation signal is affected by the non-ideal sampling, which distorts it by adding spurs to the output. The technique detects the mismatch errors by extracting the spurs attached to the estimation signal, without interrupting data conversion. Because the method does not rely on feedback, it converges quickly. A simple gain stage and a derivative filter are employed to compensate the identified mismatch errors. Simulation results show that the SFDR of a two-channel TI-ADC is improved by 23.2 dB. Moreover, the gain and timing mismatches are identified within just 265 samples from the start of the simulation. The ripple on the identified error is at most 0.4% of the converter's true mismatch error.
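The compensation structure the abstract mentions (a per-channel gain scaling plus a derivative filter that undoes a small timing skew) can be illustrated with a minimal sketch. This is not the paper's identification algorithm; the mismatch values, tone frequency, and the central-difference derivative approximation below are all illustrative assumptions.

```python
import numpy as np

# Hypothetical setup: a sine input sampled by a two-channel TI-ADC
N = 4096
f0 = 0.1013  # normalized input frequency (assumed test tone)
n = np.arange(N)
ideal = np.sin(2 * np.pi * f0 * n)

# Channel 1 carries an assumed 2% gain mismatch and a timing skew
# of 1% of a sample period; channel 0 is taken as the reference.
g = [1.0, 1.02]
dt = [0.0, 0.01]
y = np.empty(N)
for ch in range(2):
    idx = np.arange(ch, N, 2)
    y[idx] = g[ch] * np.sin(2 * np.pi * f0 * (idx + dt[ch]))

# Compensation (given known mismatch values): divide out the gain,
# then subtract skew times a derivative estimate. Here the derivative
# is approximated by a central difference over neighboring samples,
# standing in for the derivative filter of the abstract.
y_corr = y.copy()
ch1 = np.arange(1, N - 1, 2)          # channel-1 sample indices
y_corr[ch1] = y[ch1] / g[1]           # gain correction
deriv = (y_corr[ch1 + 1] - y_corr[ch1 - 1]) / 2.0
y_corr[ch1] -= dt[1] * deriv          # timing-skew correction
```

With these assumed mismatch values, the residual error on the corrected channel drops well below the uncorrected error; a practical design would replace the two-tap central difference with a longer FIR differentiator for better high-frequency accuracy.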
Email Address of Submitting Author: i.firstname.lastname@example.org
ORCID of Submitting Author: https://orcid.org/0000-0002-0757-1258