It is shown that when the amplifier is driven near saturation, its inherent nonlinearity causes significant bit-pattern-dependent pulse distortion, particularly in the bit-rate range between about 2 and 32 Gb/s. Without proper countermeasures, this distortion can appreciably degrade system performance through two basic mechanisms. The first, which can result in a system power penalty of as much as 10 dB, occurs in a standard decision circuit that automatically sets the threshold voltage to the average signal level rather than to the middle of the eye opening. The second mechanism, which occurs even with the threshold set properly, is due to the nonlinear enhancement of the ordinary linear intersymbol interference (ISI) introduced by the receiver filter. For example, computations of system performance at 8 Gb/s using an RC filter that gives a quite acceptable 10% eye closure under linear conditions show that when the amplifier is driven to its saturation output power level, this mechanism causes a system power penalty of about 1 dB, rising to about 4.5 dB when the power is doubled. Interestingly, with the proper threshold setting, an ideal integrate-and-dump receiver, which introduces no ISI, is shown to suffer no power penalty due to amplifier nonlinearity.
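As a rough illustration of these two mechanisms (not taken from the paper itself), the Python sketch below drives a saturating amplifier model with a random NRZ bit pattern, passes the output through a single-pole RC receiver filter, and compares a threshold set at the average signal level with one set at the middle of the eye. The gain dynamics use the standard Agrawal-Olsson rate equation dh/dt = (h0 - h)/tau_c - (e^h - 1) P_in/E_sat; this model choice and every numerical parameter (small-signal gain, carrier lifetime, saturation energy, input power, filter bandwidth) are illustrative assumptions, not values from the paper.

```python
import numpy as np

def soa_output(p_in, dt, h0=np.log(30.0), tau_c=200e-12, e_sat=5e-12):
    """Euler-integrate dh/dt = (h0 - h)/tau_c - (exp(h) - 1)*P_in/E_sat.

    h0, tau_c, e_sat are assumed illustrative values (gain 30,
    200 ps carrier lifetime, 5 pJ saturation energy)."""
    gain_h = np.empty_like(p_in)
    h = h0
    for i, p in enumerate(p_in):
        h += dt * ((h0 - h) / tau_c - (np.exp(h) - 1.0) * p / e_sat)
        gain_h[i] = h
    return p_in * np.exp(gain_h)         # P_out(t) = G(t) * P_in(t)

def rc_filter(x, dt, f3db):
    """Single-pole RC low-pass standing in for the receiver filter."""
    alpha = dt / (dt + 1.0 / (2.0 * np.pi * f3db))
    y, acc = np.empty_like(x), 0.0
    for i, v in enumerate(x):
        acc += alpha * (v - acc)        # simple IIR form of the RC pole
        y[i] = acc
    return y

rng = np.random.default_rng(0)
bit_rate = 8e9                          # 8 Gb/s, inside the 2-32 Gb/s range
spb = 32                                # samples per bit
dt = 1.0 / (bit_rate * spb)
bits = rng.integers(0, 2, 512)
p_in = np.repeat(bits, spb) * 0.5e-3    # 0.5 mW NRZ marks (assumed)

p_out = soa_output(p_in.astype(float), dt)
v = rc_filter(p_out, dt, 0.7 * bit_rate)  # assumed 3 dB bandwidth

# Sample each bit at its center and compare the two threshold rules.
centers = v[spb // 2 :: spb][: len(bits)]
ones, zeros = centers[bits == 1], centers[bits == 0]
th_avg = v.mean()                           # average-signal-level threshold
th_eye = 0.5 * (ones.min() + zeros.max())   # middle of the eye opening
print(f"eye opening: [{zeros.max():.3e}, {ones.min():.3e}] W")
print(f"avg-level threshold {th_avg:.3e} W vs mid-eye {th_eye:.3e} W")
```

With these assumed parameters the pattern effect compresses marks that follow long runs of ones, so the average-level threshold no longer coincides with the mid-eye point, and the filtered samples show the saturation-enhanced ISI; the printed spread of the mark and space samples is a crude proxy for the eye closure discussed in the abstract.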