Abstract

The rapid growth in broadband services is increasing the demand for high-speed optical communication systems. However, as the data rate increases, transmission impairments such as chromatic dispersion (CD) become prominent and require careful compensation. In addition, it is anticipated that next-generation optical networks will be intelligent and adaptive, with impairment compensation that can be software-defined and re-programmed to adapt to changes in network conditions. This flexibility should allow dynamic resource reallocation, improve network efficiency, and reduce operation and maintenance costs. Conventional dispersion compensating fiber (DCF) is bulky and requires careful design for each fiber link, together with associated amplifiers and monitoring. Recently, advances in high-speed microelectronics, for example 30 GSample/s analogue-to-digital converters (ADCs) (Ellermeyer et al., 2008), have enabled the application of electronic dispersion compensation (EDC) (Iwashita & Takachio, 1988; Winters & Gitlin, 1990) in optical communication systems at 10 Gbaud and beyond. The maturity of electronic buffering, computation, and large-scale integration makes EDC more cost-effective, adaptive, and easier to integrate into transmitters or receivers, both for extending the reach of legacy multimode optical fiber links (Weem et al., 2005; Schube & Mazzini, 2007) and for metro and long-haul optical transmission systems (Bulow & Thielecke, 2001; Haunstein & Urbansky, 2004; Xia & Rosenkranz, 2006; Bosco & Poggiolini, 2006; Chandrasekhar et al., 2006; Zhao & Chen, 2007; Bulow et al., 2008). Transmitter-side EDC (McNicol et al., 2005; McGhan et al., 2005 & 2006) exhibits high performance, but its adaptation speed is limited by the round-trip delay.
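To make the compensation concrete: CD acts on the optical field as a linear all-pass filter with quadratic phase, H(f) = exp(-jπλ²DLf²/c), so an electronic equalizer with access to the complex field can undo it exactly by applying the conjugate phase in the frequency domain. The sketch below illustrates this principle only; the function name, parameter choices, and sign convention are illustrative assumptions, not taken from the cited works.

```python
import numpy as np

def cd_compensate(signal, fs, wavelength, dispersion, length):
    """Illustrative frequency-domain CD equalizer (hypothetical helper).

    signal     : complex baseband field samples (phase information is
                 required, which is why coherent detection outperforms
                 direct detection here)
    fs         : sampling rate [Hz]
    wavelength : carrier wavelength [m]
    dispersion : fiber dispersion parameter D [s/m^2]
    length     : fiber length [m]
    """
    c = 299792458.0                       # speed of light [m/s]
    f = np.fft.fftfreq(len(signal), d=1.0 / fs)   # baseband frequencies
    # Inverse of the fiber's all-pass response H(f) = exp(-j*pi*lam^2*D*L*f^2/c)
    # (sign convention varies with the chosen Fourier-transform definition).
    h_inv = np.exp(1j * np.pi * wavelength**2 * dispersion * length * f**2 / c)
    return np.fft.ifft(np.fft.fft(signal) * h_inv)
```

Because H(f) has unit magnitude, the inversion is numerically benign; for example, dispersing a signal with H(f) and then calling `cd_compensate` with the same parameters recovers the original samples to machine precision.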
Receiver-side EDC can adapt quickly to changes in link conditions and is of particular value for future transparent optical networks, where the reconfiguration of add- and drop-nodes will cause the transmission paths to vary frequently. Direct-detection maximum likelihood sequence estimation (DD MLSE) receivers are commercially available and have been demonstrated in various transmission experiments (Farbert et al., 2004; Gene et al., 2007; Alfiad et al., 2008). However, the performance of conventional EDC using direct detection (DD) is limited by the loss of the signal phase information (Franceschini et al., 2007). In addition, square-law detection transforms the linear optical impairment arising from CD into a nonlinear impairment, which significantly increases the operational complexity of DD EDC. For example, DD MLSE was numerically predicted to achieve 700 km single-mode fiber (SMF) transmission at 10 Gbit/s, but required 8192 Viterbi processor states (Bosco & Poggiolini, 2006).
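The state count quoted above follows directly from the trellis size of MLSE: with binary signalling and an intersymbol-interference memory of m symbols, the Viterbi processor needs 2^m states, so 8192 states corresponds to a memory of 13 symbols (2^13 = 8192). The minimal decoder below is a generic sketch of this principle, not the receiver of the cited work; the function name and channel taps are illustrative assumptions.

```python
import itertools

def mlse_viterbi(received, taps):
    """Sketch of MLSE over a binary (+/-1) ISI channel with impulse
    response `taps`. The trellis has 2**(len(taps) - 1) states, which is
    why a 13-symbol memory implies 8192 Viterbi states.
    """
    memory = len(taps) - 1
    # A state is the tuple of the `memory` most recent past symbols.
    states = list(itertools.product([-1, 1], repeat=memory))
    cost = {s: 0.0 for s in states}       # unknown start: all states allowed
    paths = {s: [] for s in states}
    for r in received:
        new_cost, new_paths = {}, {}
        for s in states:
            for a in (-1, 1):             # hypothesised current symbol
                # Noiseless channel output for this trellis transition.
                y = taps[0] * a + sum(t * x for t, x in zip(taps[1:], s))
                ns = (a,) + s[:-1]        # shift symbol into the state
                m = cost[s] + (r - y) ** 2   # Euclidean branch metric
                if ns not in new_cost or m < new_cost[ns]:   # survivor
                    new_cost[ns] = m
                    new_paths[ns] = paths[s] + [a]
        cost, paths = new_cost, new_paths
    return paths[min(cost, key=cost.get)]
```

Per received symbol the decoder evaluates 2 branches per state, so the work grows exponentially with channel memory; this is the practical cost behind the 8192-state figure, and it is what makes the nonlinear post-detection channel of DD EDC so expensive to equalize.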
