Abstract

We analyze theoretically the polarization properties of a partially coherent optical field that propagates in a nonlinear Kerr medium. We consider the standard model of two resonantly coupled nonlinear Schrödinger equations, which account for a wave-vector mismatch between the orthogonal polarization components. We show that such a phase mismatch is responsible for the existence of a spontaneous repolarization process of the partially incoherent optical field during its nonlinear propagation. The repolarization process is characterized by an irreversible evolution of the unpolarized beam towards a highly polarized state, without any loss of energy. This unexpected result contrasts with the commonly accepted idea that an optical field undergoes a depolarization process under nonlinear evolution. The repolarization effect can be described in detail by simple thermodynamic arguments based on the kinetic wave theory: it is shown to result from the natural tendency of the optical field to approach its thermal equilibrium state. The theory then reveals that it is thermodynamically advantageous for the optical field to evolve towards a highly polarized state, because this permits the optical field to reach the "most disordered state", i.e., the state of maximum (nonequilibrium) entropy. The theory is in quantitative agreement with the numerical simulations, without adjustable parameters. The physics underlying the irreversible property of the repolarization process is briefly discussed in analogy with the celebrated Joule's experiment of free expansion of a gas. Besides its fundamental interest, the repolarization effect may be exploited to achieve complete polarization of unpolarized incoherent light without loss of energy.
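To make the described model concrete, the following is a minimal numerical sketch, not taken from the paper: it integrates two coherently coupled nonlinear Schrödinger equations with a wave-vector mismatch d_beta by a symmetric split-step Fourier scheme, starting from a partially coherent unpolarized field, and it monitors the degree of polarization (DOP) through window-averaged Stokes parameters. The coupling coefficients A and B, the mismatch value, the assumed form of the coupling terms, and the helper functions unpolarized_field, nonlinear_rhs, and dop are illustrative assumptions, not quantities or code from the paper.

```python
# Hypothetical sketch: split-step Fourier integration of two coherently coupled
# NLS equations with a wave-vector mismatch d_beta; the degree of polarization
# is computed from window-averaged Stokes parameters. All parameter values are
# illustrative placeholders, not taken from the paper.
import numpy as np

# --- numerical grid (dimensionless units) ---
N, T = 2048, 200.0                            # number of points, temporal window
t = (np.arange(N) - N // 2) * (T / N)
w = 2 * np.pi * np.fft.fftfreq(N, d=T / N)    # angular frequencies

# --- model parameters (assumed values) ---
A, B = 2.0 / 3.0, 1.0 / 3.0                   # cross-phase and coherent-coupling coefficients
d_beta = 1.0                                  # wave-vector mismatch between components
dz, z_max = 0.005, 50.0                       # step size and propagation distance

def unpolarized_field(bandwidth=1.0, power=1.0, rng=np.random.default_rng(0)):
    """Partially coherent field: independent Gaussian random spectra per component."""
    spec_filter = np.exp(-w**2 / (2 * bandwidth**2))
    def component():
        phases = rng.normal(size=N) + 1j * rng.normal(size=N)
        a = np.fft.ifft(spec_filter * phases)
        return a * np.sqrt(power / np.mean(np.abs(a)**2))
    return component(), component()

def nonlinear_rhs(u, v, z):
    """du/dz, dv/dz from the Kerr and coherent-coupling terms (assumed form)."""
    du = 1j * ((np.abs(u)**2 + A * np.abs(v)**2) * u
               + B * v**2 * np.conj(u) * np.exp(-1j * d_beta * z))
    dv = 1j * ((np.abs(v)**2 + A * np.abs(u)**2) * v
               + B * u**2 * np.conj(v) * np.exp(+1j * d_beta * z))
    return du, dv

def dop(u, v):
    """Degree of polarization from window-averaged Stokes parameters."""
    s0 = np.mean(np.abs(u)**2 + np.abs(v)**2)
    s1 = np.mean(np.abs(u)**2 - np.abs(v)**2)
    s2 = np.mean(2 * np.real(u * np.conj(v)))
    s3 = np.mean(2 * np.imag(u * np.conj(v)))
    return np.sqrt(s1**2 + s2**2 + s3**2) / s0

u, v = unpolarized_field()
half_disp = np.exp(-1j * 0.5 * w**2 * dz / 2)  # half-step of linear dispersion
z = 0.0
while z < z_max:
    # symmetric split-step: half dispersion, full nonlinear step (RK2), half dispersion
    u = np.fft.ifft(half_disp * np.fft.fft(u))
    v = np.fft.ifft(half_disp * np.fft.fft(v))
    k1u, k1v = nonlinear_rhs(u, v, z)
    k2u, k2v = nonlinear_rhs(u + 0.5 * dz * k1u, v + 0.5 * dz * k1v, z + 0.5 * dz)
    u, v = u + dz * k2u, v + dz * k2v
    u = np.fft.ifft(half_disp * np.fft.fft(u))
    v = np.fft.ifft(half_disp * np.fft.fft(v))
    z += dz

print(f"DOP after z = {z_max}: {dop(u, v):.3f}")
```

Under these assumptions, tracking dop(u, v) along z indicates whether the field drifts from DOP ≈ 0 towards a strongly polarized state; the quantitative thermodynamic predictions reported in the paper are not reproduced by this sketch.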
