Abstract

A wavelength-division-multiplexed (WDM) dispersion-managed (DM) optical fiber system is one of the key components in the current development of ultrafast, high-bit-rate optical communication lines. High transmission capacity is achieved by combining wavelength multiplexing with dispersion management (see, e.g., Refs. [1, 2]). A dispersion-managed [3, 4, 5, 6] optical system is designed to have a low (or even zero) path-averaged dispersion by periodically alternating the sign of the dispersion along the fiber, which dramatically reduces pulse broadening. Dispersion-slope (second-order GVD) effects, together with the path-averaged GVD, cause optical pulses in distinct WDM channels to move with different group velocities. Consequently, modeling a WDM system requires simulating a long time interval, and enormous computational resources are needed to accurately capture the nonlinear inter-channel interactions that degrade the bit-rate capacity. Here an efficient numerical algorithm is developed for massively parallel computation of WDM systems. The required computational time is inversely proportional to the number of processors used, which makes a full-scale numerical simulation of a WDM system feasible on a workstation cluster with a few hundred processors.
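The abstract does not spell out the propagation scheme being parallelized. As a rough, hedged illustration of the kind of per-step computation such a simulation performs, the sketch below propagates a single channel of the nonlinear Schrödinger equation with the standard split-step Fourier method, alternating the dispersion sign between fiber segments to mimic dispersion management. All function names, parameters, and sign conventions here are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

def split_step_segment(u, beta2, gamma, dz, n_steps, t_window):
    """Symmetric split-step Fourier propagation over one fiber segment.

    Assumed model (illustrative sign convention):
        u_z = (i*beta2/2) * u_tt + i*gamma*|u|^2 * u
    beta2 : group-velocity dispersion of this segment
    gamma : Kerr nonlinearity coefficient
    """
    n = u.size
    omega = 2.0 * np.pi * np.fft.fftfreq(n, d=t_window / n)
    # Linear half-step operator, applied in the frequency domain.
    half_disp = np.exp(0.5j * (beta2 / 2.0) * omega**2 * dz)
    u = u.astype(complex)
    for _ in range(n_steps):
        u = np.fft.ifft(half_disp * np.fft.fft(u))   # half linear step
        u = u * np.exp(1j * gamma * np.abs(u)**2 * dz)  # full nonlinear step
        u = np.fft.ifft(half_disp * np.fft.fft(u))   # half linear step
    return u

def propagate_dispersion_managed(u0, beta2_plus, beta2_minus, gamma,
                                 dz, steps_per_segment, n_periods, t_window):
    """Alternate segments of opposite dispersion sign, so the
    path-averaged dispersion is (beta2_plus + beta2_minus) / 2."""
    u = u0.astype(complex)
    for _ in range(n_periods):
        u = split_step_segment(u, beta2_plus, gamma, dz, steps_per_segment, t_window)
        u = split_step_segment(u, beta2_minus, gamma, dz, steps_per_segment, t_window)
    return u
```

In a lossless model both sub-steps are pure phase multiplications, so the pulse energy is conserved, which gives a convenient sanity check. The sketch is serial; a parallel WDM solver in the spirit of the abstract would distribute the channels (or time windows) across processors, which is why the wall-clock time can scale inversely with the processor count.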
