Abstract

<para xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"> This paper presents the architecture and circuit design of a single-chip 32 mm<formula formulatype="inline"><tex Notation="TeX">$^2$</tex></formula> 90 nm CMOS DSP transceiver for electronic dispersion compensation (EDC) of multimode fibers at 10 Gb/s, based on maximum likelihood sequence detection (MLSD). This is the first MLSD-based transceiver for multimode fibers and the first fully integrated DSP-based transceiver for optical channels reported in the technical literature. The digital receiver incorporates equalization, Viterbi detection, channel estimation, timing recovery, and gain control functions. The analog front-end incorporates an 8-way interleaved ADC with self-calibration, a programmable gain amplifier, a phase interpolator, and the transmitter. Also integrated are a XAUI interface, the physical coding sublayer (PCS), and miscellaneous test and control functions. Experimental results using the stressors specified by the IEEE 10GBASE-LRM standard <citerefgrp><citeref refid="ref1"/></citerefgrp>, as well as industry-defined worst-case fibers, are reported. A sensitivity of <formula formulatype="inline"><tex Notation="TeX">${-}$</tex></formula>13.68 dBm is demonstrated for the symmetric stressor of <citerefgrp><citeref refid="ref1"/></citerefgrp> in a line card application with a 6-inch FR4 interconnect. </para>
