Abstract

Many laser applications require the propagation of extremely high average power radiation through Faraday isolators. Although the thermal lens effect has been investigated extensively, polarization contamination has not been discussed in detail, even though depolarization can significantly limit the isolation ratio. The physical origin of the self-induced depolarization is absorption of the laser radiation, which produces a spatially nonuniform temperature distribution. This gives rise to two effects that reduce the isolation ratio: the temperature dependence of the Verdet constant and thermally induced birefringence. The depolarization ratio is the sum of two terms representing these two effects, and the latter is the stronger one. To suppress the self-induced depolarization, two novel optical schemes were suggested and realized in experiment. The idea of compensating the depolarization is to replace the single 45-degree Faraday rotator by two 22.5-degree rotators with a reciprocal optical element between them: the polarization distortions a beam acquires while passing through the first rotator are then partially compensated in the second rotator. Both schemes increase the isolation ratio by up to 100 times in comparison with the traditional scheme. Different designs of a Faraday isolator for 1 kW average power with an isolation ratio of about 30 dB are discussed.
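The rotator arrangement described above can be illustrated with ideal Jones matrices. The sketch below is a minimal model, not the paper's calculation: it ignores the thermally induced distortions entirely and only checks that the two-rotator scheme still blocks the backward pass in the ideal case. The 67.5-degree angle chosen for the reciprocal element is an assumption for illustration; the abstract specifies only "a reciprocal optical element". Faraday rotation is nonreciprocal (same rotation sense on both passes), while the reciprocal element's rotation reverses sign on the return pass.

```python
import numpy as np

def rot(theta_deg):
    """2x2 Jones rotation matrix for an angle given in degrees."""
    t = np.deg2rad(theta_deg)
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s], [s, c]])

h_in = np.array([1.0, 0.0])  # horizontally polarized input beam

# Traditional isolator: one 45-degree Faraday rotator.
# Nonreciprocal: the same rotation applies on forward and backward passes,
# so a round trip rotates the polarization by 90 degrees and the input
# polarizer blocks the returning beam.
fr45 = rot(45)
leak_traditional = abs((fr45 @ fr45 @ h_in)[0]) ** 2  # ~0

# Compensated scheme (sketch): two 22.5-degree Faraday rotators with a
# reciprocal rotator between them (67.5 degrees assumed here).
fr = rot(22.5)
forward = fr @ rot(+67.5) @ fr   # reciprocal element: +angle going forward
backward = fr @ rot(-67.5) @ fr  # ... and -angle on the return pass
leak_compensated = abs((backward @ forward @ h_in)[0]) ** 2  # also ~0

print(f"traditional scheme leakage:  {leak_traditional:.2e}")
print(f"compensated scheme leakage:  {leak_compensated:.2e}")
```

Because all the matrices here are pure rotations, the round-trip rotation is simply the sum of the angles: 22.5 + 67.5 + 22.5 forward and 22.5 - 67.5 + 22.5 backward, totalling 90 degrees, exactly as for the single 45-degree rotator. The benefit of the split scheme claimed in the paper appears only once the thermally induced birefringence of each rotator half is included, which this ideal sketch does not model.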
