Abstract

The optical powers used in telecommunications networks have increased over time with the advent of technologies such as erbium-doped fibre amplifiers, dense WDM, and fibre Raman amplifiers. Erbium-doped fibre and Raman amplifiers with output powers of up to 1 W and 2 W respectively are now available for deployment, giving optical power densities of ≈ 14 000 MW/m² per watt launched into a single-mode fibre; this compares with 74 MW/m² at the surface of the sun. It is thus essential that the risk of optical damage to the fibre is understood and that steps are taken to mitigate it where necessary. Research carried out by BT over the last few years has shown that fibres carrying relatively modest optical powers, as low as 200 mW for the most sensitive fibres, can be catastrophically damaged at tight fibre bends. Damage occurs when some of the optical power lost at the bend is absorbed by the fibre coating; over time this can produce a sudden rise in temperature to over 700 °C, causing either the silica to kink, mimicking a fibre break, or the coating to burn off at the bend, leading to fibre failure on subsequent handling. This paper describes the factors that increase the risk of short-term catastrophic damage at fibre bends, presents experimental results showing the variation in sensitivity between different fibres, and discusses important new theoretical results modelling the temperature rise in the fibre.
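As a rough check of the quoted power density, the following back-of-envelope estimate treats the launched power as spread uniformly over the mode-field area and assumes a mode-field diameter of about 9.5 µm (a typical single-mode value; the abstract does not state the fibre type):

\[
I \;=\; \frac{P}{\pi w^{2}}
\;=\; \frac{1\ \text{W}}{\pi \,(4.75\times10^{-6}\ \text{m})^{2}}
\;\approx\; 1.4\times10^{10}\ \text{W/m}^{2}
\;=\; 14\,000\ \text{MW/m}^{2},
\]

where \(w\) is the mode-field radius. The result is sensitive to the assumed mode-field diameter, so it should be read only as an order-of-magnitude confirmation of the figure given above.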
