Abstract

The rising cost of production testing for systems-on-chip (SoCs) is a crucial concern for chip makers, owing to long test times and costly automated test equipment (ATE). This paper proposes a spectral-leakage-driven built-in self-test (BIST) scheme that accurately predicts the nonlinearity of mixed-signal circuits in loopback mode, achieving greater cost-effectiveness than previous BIST-based approaches. A digitally synthesized single-tone sinusoidal stimulus, of the kind used in conventional harmonic testing, is incoherently sampled by a device under test (DUT). The DUT output therefore exhibits a correlation between the DUT harmonics and the spectral leakage introduced by the incoherent sampling. This output is then fed to a second DUT through a loopback path, so that the harmonics of the DUT pair are correlated with the spectral leakage in the loopback response; the magnitude of the spectral leakage acts as a weighting factor on the harmonic magnitudes of the two DUTs. The correlation is quantitatively modeled by the characteristic equations in (15), and postprocessing predicts the harmonics of each individual DUT by simultaneously solving these equations on the DSP core already available in an SoC. Simulation and hardware measurements show prediction errors below 0.3 dB and 0.6 dB, respectively, validating that the proposed scheme is practical for production testing.
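The effect the scheme exploits can be illustrated numerically. The sketch below (not from the paper; all sample rates, tone frequencies, and thresholds are illustrative assumptions) compares the FFT of a sine tone under coherent sampling, where an integer number of cycles fits in the record and the energy lands in a single bin, against incoherent sampling, where the tone falls between bins and energy smears into leakage skirts:

```python
import numpy as np

FS = 1000.0  # sample rate in Hz (illustrative assumption)
N = 1024     # record length / FFT size (illustrative assumption)

def peak_and_leakage(f0):
    """Return (peak-bin magnitude, total magnitude outside the peak bin)
    for a unit sine tone at frequency f0 sampled over one N-point record."""
    t = np.arange(N) / FS
    x = np.sin(2 * np.pi * f0 * t)
    mag = np.abs(np.fft.rfft(x)) / N  # one-sided magnitude spectrum
    k = np.argmax(mag)                # bin carrying the tone
    return mag[k], mag.sum() - mag[k] # leakage = everything off the peak bin

# Coherent case: tone sits exactly on bin 10 (integer cycles per record).
f_coherent = 10 * FS / N
# Incoherent case: tone sits halfway between bins 10 and 11.
f_incoherent = 10.5 * FS / N

_, leak_coh = peak_and_leakage(f_coherent)
_, leak_inc = peak_and_leakage(f_incoherent)

print(leak_coh < 1e-6)       # coherent: leakage at numerical-noise level
print(leak_inc > 100 * leak_coh)  # incoherent: pronounced leakage skirts
```

In the proposed BIST scheme, this leakage is not a nuisance to be windowed away but a measurable quantity whose magnitude weights the DUT harmonics, which is what makes the characteristic equations solvable.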
