Abstract

The ability to resolve sub-nanosecond phase shifts on internal IC nodes is central to the utility of electron beam test systems. In this work, the obtainable time delay resolution has been investigated both experimentally and via computer modeling. Initial time delay resolution results were obtained at working distances from 1 to 32 mm in a conventional electron beam test system. To obtain these results, a signal generation system was developed; it proved adequate for resolution measurements down to 10 ps. Monte Carlo simulated waveforms were generated and used to develop a consistent method of determining the minimum detectable delay. This method, based on linear regression and confidence intervals, corresponded well with visual judgments of the resolution. The minimum detectable delays using this criterion ranged from 8 to 44 ps at a static beam current of 20 nA, a pulse width of 150 ps, and a 100 kHz repetition rate. The degradation of resolution with increasing working distance was attributed to the transit time effect, since the acquired waveforms showed no significant jitter noise and no increase in noise with working distance.
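To make the regression-and-confidence-interval idea concrete, the sketch below shows one plausible way a minimum detectable delay could be extracted from Monte Carlo waveforms. It is not the authors' code: the Gaussian pulse shape, the noise level, the cross-correlation delay estimator, and the two-sigma detection criterion are illustrative assumptions; the abstract only states that linear regression and confidence intervals were used.

```python
# Hedged sketch: estimating a minimum detectable delay from Monte Carlo
# waveforms via linear regression. Waveform model, noise, and the 2-sigma
# criterion are assumptions made for illustration, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)

def simulate_waveform(delay_ps, t_ps, noise_rms=0.05):
    """Gaussian pulse (FWHM ~ 150 ps) shifted by `delay_ps`, plus white noise."""
    sigma = 150.0 / 2.355                      # FWHM -> standard deviation
    pulse = np.exp(-0.5 * ((t_ps - delay_ps) / sigma) ** 2)
    return pulse + rng.normal(0.0, noise_rms, t_ps.size)

def estimate_delay(ref, shifted, t_ps):
    """Delay estimate from the peak of the cross-correlation (grid-quantized)."""
    lag = np.argmax(np.correlate(shifted, ref, mode="full")) - (ref.size - 1)
    return lag * (t_ps[1] - t_ps[0])

t = np.arange(-500.0, 500.0, 1.0)              # 1 ps sampling grid
applied = np.linspace(0.0, 50.0, 11)           # applied delays, ps
measured = np.array([
    np.mean([estimate_delay(simulate_waveform(0.0, t),
                            simulate_waveform(d, t), t)
             for _ in range(20)])               # average over Monte Carlo runs
    for d in applied
])

# Linear regression of measured delay against applied delay.
slope, intercept = np.polyfit(applied, measured, 1)
resid = measured - (slope * applied + intercept)
s = np.sqrt(np.sum(resid ** 2) / (applied.size - 2))   # residual std. deviation

# Detection criterion (assumed here): the smallest applied delay whose fitted
# response exceeds the zero-delay prediction by twice the residual scatter.
min_detectable_ps = 2.0 * s / slope
print(f"approx. minimum detectable delay: {min_detectable_ps:.1f} ps")
```

Under these assumptions the criterion mirrors the abstract's logic: the regression fixes the relationship between applied and measured delay, and the scatter about that fit sets the confidence bound below which a delay cannot be distinguished from zero.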

