Abstract

This paper presents a calibration methodology to reduce the gain linearity error of a time amplifier (TA). We divide the input range into several segments and compute, for each segment, the average of the maximum and minimum TA gain. Checking points, defined as the maximum interval of each segment, are used to determine which segment, and hence which TA gain, applies to a measured interval. Each segment spans 50 ps: 0–50 ps is the first segment, 50–100 ps the second, and so on. For a TA with a 700 ps measurement range, the range is divided into 14 segments, each with its own TA gain. The gain error is reduced from 0.82% to 0.33%, and the distortion caused by gain error is reduced from 5.74 ps to 0.51 ps.
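The segmented lookup described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the 50 ps segment width, 700 ps range, and 14 segments come from the abstract, while the per-segment minimum/maximum gain values are hypothetical placeholders.

```python
SEGMENT_WIDTH_PS = 50.0
NUM_SEGMENTS = 14  # 700 ps range / 50 ps per segment

def segment_index(interval_ps: float) -> int:
    """Pick the segment whose checking point (its maximum interval,
    i.e. the segment's upper edge) bounds the measured interval."""
    idx = int(interval_ps // SEGMENT_WIDTH_PS)
    return min(idx, NUM_SEGMENTS - 1)

def calibrated_gain(interval_ps: float, gains: list) -> float:
    """Look up the calibrated TA gain for a measured interval."""
    return gains[segment_index(interval_ps)]

# Hypothetical (min, max) TA gain measured in each segment; the
# calibrated gain per segment is the average of its max and min.
measured = [(15.9 + 0.01 * i, 16.1 + 0.01 * i) for i in range(NUM_SEGMENTS)]
gains = [(g_min + g_max) / 2.0 for (g_min, g_max) in measured]

print(segment_index(75.0))   # a 75 ps interval falls in the 50-100 ps segment
print(calibrated_gain(75.0, gains))
```

The checking-point comparison is implicit in the floor division: an interval is assigned to the first segment whose upper edge it does not exceed.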
