Abstract

This paper extends the application of importance sampling to include the leading-order effect of dispersive radiation on the soliton's phase when simulating bit errors in optical communication systems that use optical solitons as bit carriers. A simple one-parameter model for the radiation accounts for its most significant effect on the phase: a mean shift that scales with the noise bandwidth. The improved model informs the optimal biasing of the paths used in importance-sampled Monte Carlo simulations, and the resulting numerics demonstrate improved targeting of phase values.
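
To make the role of biased path sampling concrete, the following is a minimal toy sketch in Python, not the paper's actual soliton model or biasing scheme. It treats the accumulated phase as a Gaussian random walk whose increments carry a deterministic per-span mean shift standing in for the one-parameter radiation correction (scaling with the noise bandwidth), biases the increments so the mean path lands on a rare target phase, and reweights each sample with the Gaussian likelihood ratio. All parameter names and values (N, sigma, B, c_rad, phi_target) are hypothetical illustrations.

```python
# Illustrative importance-sampling sketch (assumed toy model, not the paper's).
import numpy as np

rng = np.random.default_rng(0)

N = 50              # number of amplifier spans (hypothetical)
sigma = 0.02        # per-span phase-noise standard deviation (hypothetical)
B = 10.0            # noise bandwidth, arbitrary units (hypothetical)
c_rad = 1e-3        # one-parameter radiation coefficient (hypothetical)
mu_rad = c_rad * B  # assumed mean phase shift per span from radiation

phi_target = 1.0    # rare phase value to target
M = 100_000         # number of Monte Carlo samples

# Bias: shift each increment's mean so the average biased path reaches
# phi_target, i.e. the paths are steered toward the rare phase value.
bias = (phi_target - N * mu_rad) / N

phases = np.zeros(M)
log_w = np.zeros(M)  # log likelihood ratios (unbiased density / biased density)
for _ in range(N):
    # Draw biased increments: N(mu_rad + bias, sigma^2)
    dphi = rng.normal(mu_rad + bias, sigma, size=M)
    phases += dphi
    # Gaussian likelihood ratio for a mean shift of `bias`
    log_w += -bias * (dphi - mu_rad) / sigma**2 + bias**2 / (2 * sigma**2)

# Importance-sampled estimate of the tail probability P(phase >= phi_target)
p_hat = np.mean(np.exp(log_w) * (phases >= phi_target))
print(f"estimated tail probability: {p_hat:.3e}")
```

With the mean shift included in the nominal (unbiased) density, the biasing and the likelihood-ratio weights both account for the radiation-induced drift, which is the spirit of the targeting improvement described above; the paper's actual path biasing is derived from the soliton dynamics rather than this toy random walk.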
