Abstract

Jitter is one of the main factors affecting the bit error rate (BER), and jitter decomposition is a crucial tool for characterizing jitter at a given BER. In this article, we address this problem with PointNet and propose a PointNet-based dual-Dirac model (PointNet-DD), whose input is the two-dimensional point cloud formed by the coordinates of the jitter histogram. In particular, we develop a feature extractor in which the stride of the one-dimensional convolution layers of PointNet is changed to better learn local features, and the variance-related and mean-related features hidden in the jitter histogram point cloud are extracted by global max pooling and global average pooling, respectively. We then embed a dual-Dirac model in the network for jitter calculation, making the estimated deterministic jitter and random jitter more accurate. As a result, PointNet-DD reduces the mean absolute error of jitter decomposition. Finally, the approach is validated on practical test circuits. Experimental results show that the proposed method outperforms other methods in accuracy, robustness, and space and time cost.
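
To make the pipeline concrete, below is a minimal PyTorch sketch of the kind of architecture the abstract describes: a strided 1-D convolutional point-feature encoder, global max and average pooling, and a regression head whose outputs are combined by the standard dual-Dirac formula TJ(BER) = DJ(δδ) + 2·Q(BER)·σ_RJ (with 2·Q ≈ 14.069 at BER = 1e-12). All identifiers, layer sizes, and the stride value here are illustrative assumptions, not the authors' exact implementation.

```python
# Hypothetical sketch of a PointNet-style jitter-decomposition network.
# Layer sizes, the stride value, and all names are assumptions for illustration.
import torch
import torch.nn as nn

class PointNetDD(nn.Module):
    def __init__(self, in_dims: int = 2, feat_dims: int = 64):
        super().__init__()
        # Point-wise MLP realized as 1-D convolutions; a stride > 1 (assumed)
        # aggregates neighboring histogram bins to capture local structure.
        self.encoder = nn.Sequential(
            nn.Conv1d(in_dims, feat_dims, kernel_size=3, stride=2, padding=1),
            nn.BatchNorm1d(feat_dims),
            nn.ReLU(),
            nn.Conv1d(feat_dims, feat_dims, kernel_size=3, stride=2, padding=1),
            nn.BatchNorm1d(feat_dims),
            nn.ReLU(),
        )
        # Regression head: predicts deterministic jitter DJ and the RMS of
        # random jitter sigma_RJ from the concatenated global features.
        self.head = nn.Sequential(
            nn.Linear(2 * feat_dims, 64),
            nn.ReLU(),
            nn.Linear(64, 2),
        )

    def forward(self, points: torch.Tensor):
        # points: (batch, 2, n_points) -- (position, count) coordinates of the histogram
        feats = self.encoder(points)
        g_max = feats.max(dim=-1).values   # "variance-related" global features
        g_avg = feats.mean(dim=-1)         # "mean-related" global features
        dj, sigma_rj = self.head(torch.cat([g_max, g_avg], dim=-1)).unbind(-1)
        return dj, sigma_rj

def total_jitter(dj, sigma_rj, two_q_ber: float = 14.069):
    # Standard dual-Dirac combination: TJ(BER) = DJ(dd) + 2*Q(BER)*sigma_RJ;
    # 2*Q(BER) is about 14.069 at BER = 1e-12.
    return dj + two_q_ber * sigma_rj
```

Training such a network against reference DJ/RJ values (e.g., with a mean-absolute-error loss) would mirror the decomposition task evaluated in the article, though the actual training setup is not specified in the abstract.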
