Abstract

In the present study, the jitter of a timing method based on the comparison of the original signal with its stretched and attenuated counterpart was investigated as a function of the ratio of the stretched to the original signal. For single CR differentiated and double RC integrated signals it was found, on the basis of both theoretical and experimental investigations, that if the time reference point lies on the trailing edge of the pulse, the jitter becomes smaller as this ratio approaches 1. On the leading edge, in contrast, a minimum was found at ratios that depend on the power spectrum of the noise at the output of the system consisting of the nuclear detector and the preamplifier, while experimental investigations carried out on semi-Gaussian signals shaped with a Gaussian shaper suggest a small minimum on the trailing edge as well.
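As a rough illustration of the kind of jitter-versus-ratio study described above, the following minimal sketch numerically estimates the spread of the crossing time between a noisy CR-(RC)² shaped pulse and a time-stretched, attenuated copy of itself, for the trailing-edge crossing. The time constant, stretch factor of 2, trial count, noise model (white noise filtered by the same shaper) and the function names `cr_rc2_pulse` and `crossing_jitter` are illustrative assumptions; the sketch does not reproduce the authors' analysis or experimental conditions.

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(0)


def cr_rc2_pulse(t, tau=1.0):
    """CR-(RC)^2 shaped pulse (single differentiation, double integration,
    equal time constants), normalised to unit peak amplitude."""
    x = np.clip(t, 0.0, None) / tau
    p = 0.5 * x**2 * np.exp(-x)
    return p / p.max()


def crossing_jitter(ratio, stretch=2.0, sigma=0.01, n_trials=400):
    """RMS spread of the trailing-edge crossing time between a noisy pulse
    and a time-stretched copy of it attenuated by `ratio`."""
    t = np.linspace(0.0, 20.0, 4000)              # time axis in units of tau
    clean = cr_rc2_pulse(t)
    start = int(np.argmax(clean))                 # search only after the pulse peak
    kernel = clean / np.sqrt(np.sum(clean**2))    # unit-noise-gain shaping filter
    times = []
    for _ in range(n_trials):
        # assumption: white noise at the shaper input, filtered by the same shaper
        noise = sigma * fftconvolve(rng.normal(size=t.size), kernel)[: t.size]
        noisy = clean + noise
        stretched = np.interp(t / stretch, t, noisy)   # time-stretched replica
        d = noisy - ratio * stretched                  # comparator input
        down = np.where((d[start:-1] > 0) & (d[start + 1:] <= 0))[0]
        if down.size == 0:
            times.append(np.nan)
            continue
        i = start + down[0]                            # first crossing after the peak
        # linear interpolation of the zero crossing between samples i and i+1
        times.append(t[i] - d[i] * (t[i + 1] - t[i]) / (d[i + 1] - d[i]))
    return float(np.nanstd(times))


if __name__ == "__main__":
    for ratio in (0.2, 0.4, 0.6, 0.8, 0.95):
        print(f"attenuation ratio {ratio:.2f}: "
              f"trailing-edge jitter ~ {crossing_jitter(ratio):.4f} tau")
```

Sweeping the attenuation ratio in this way gives only a qualitative picture of how the trailing-edge jitter behaves as the ratio approaches 1; a leading-edge variant or a different noise power spectrum would require changing the crossing search and the noise generation accordingly.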
