Abstract

Under the stringent timing requirements of high-speed systems, the effect of jitter on the bit-error rate and the overall timing error can no longer be overlooked. Because jitter timing budgets are normally specified in terms of individual jitter components, a fast and reliable jitter decomposition technique is integral to the design and verification of high-speed data links. This article presents a machine learning approach to extracting the random and deterministic jitter components from the timing histograms of the eye diagram. Training data are generated using the dual-Dirac model, and a preprocessing technique based on vector fitting is proposed to create a rational-function representation of the jitter probability density function. Numerical results indicate that the proposed method performs jitter decomposition accurately and generalizes well under different circumstances. In addition, the results show that the preprocessing step significantly accelerates training and improves the overall performance of the neural networks.
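
For context, here is a minimal sketch of how labeled training histograms could be generated under the dual-Dirac model. All names and parameter values are illustrative assumptions, not taken from the article: the model treats total jitter as two Dirac deltas at mu_l and mu_r (deterministic jitter DJ = mu_r - mu_l) convolved with a Gaussian of standard deviation sigma (random jitter RJ = sigma).

    import numpy as np

    rng = np.random.default_rng(0)
    mu_l, mu_r, sigma = -0.05, 0.05, 0.01   # in unit intervals (assumed values)

    # Sample edge-crossing times under the dual-Dirac model:
    # pick one of the two Dirac locations, then add Gaussian random jitter.
    centers = rng.choice([mu_l, mu_r], size=100_000, p=[0.5, 0.5])
    samples = centers + rng.normal(0.0, sigma, size=centers.size)

    # Timing histogram (a candidate network input) with known decomposition labels.
    hist, edges = np.histogram(samples, bins=256, density=True)
    labels = {"DJ_dd": mu_r - mu_l, "RJ_rms": sigma}

Vector fitting, in turn, approximates a sampled response by a rational function in the standard pole-residue form f(s) ≈ sum_{n=1..N} r_n / (s - p_n) + d + s·h, which is presumably the representation fitted to the jitter probability density function in the preprocessing step.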
