Abstract

Jitter is timing noise that causes bit errors in high-speed data transmission lines. If the data rate of a system is increased, the magnitude of jitter measured in seconds is roughly unchanged, but measured as a fraction of a bit period, it increases proportionally with the data rate and causes errors. Emerging technologies require the ratio of the number of errors to the total number of transmitted bits (the bit error rate, BER) to be less than one in a trillion (10^-12). As datacom, bus, and backplane data rates have increased, many different techniques for characterizing jitter have been introduced, each using a variety of different types of laboratory equipment. To fix difficult jitter problems at high data rates, engineers need to understand the diverse jitter analysis techniques used in both synchronous and asynchronous networking. The article focuses on data rates of emerging technologies above 3 Gb/s. Below 3 Gb/s, real-time oscilloscopes can capture a contiguous data stream that can be analyzed simultaneously in both the time and frequency domains; at higher data rates jitter analysis is more challenging. This discussion is from the perspective of a digital engineer, drawing on the experience of synchronous optical network/synchronous digital hierarchy (SONET/SDH) where many challenges have already been addressed.
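The scaling argument in the abstract can be made concrete with a small sketch: holding the absolute jitter fixed in seconds and sweeping the data rate shows how the jitter grows as a fraction of the unit interval (UI, one bit period). The function name and the 10 ps jitter figure below are illustrative assumptions, not values from the article.

```python
# Illustrative sketch: the same absolute jitter (in seconds) consumes
# a growing fraction of the unit interval (UI) as the data rate rises.

def jitter_in_ui(jitter_seconds, data_rate_bps):
    """Express absolute jitter as a fraction of one bit period (UI)."""
    bit_period = 1.0 / data_rate_bps  # one UI, in seconds
    return jitter_seconds / bit_period

jitter = 10e-12  # assume 10 ps of timing noise, held constant
for rate in (1e9, 3e9, 10e9):  # 1, 3, and 10 Gb/s
    print(f"{rate / 1e9:g} Gb/s -> {jitter_in_ui(jitter, rate):.2f} UI")
```

At 1 Gb/s the assumed 10 ps of jitter is only 0.01 UI, but at 10 Gb/s it occupies 0.10 UI of the eye, which is why the article treats rates above 3 Gb/s as the challenging regime.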
