Abstract

The relativistic effects on resonance absorption, which occurs when a high-intensity p-polarized laser pulse is obliquely incident on an inhomogeneous plasma, are investigated using one-dimensional particle-in-cell simulations. It is found that the absorption rate decreases as the amplitude of the incident pulse increases, until the laser intensity reaches about 3.4×10¹⁷ W/cm². This is mainly owing to the relativistic effect of the electrostatic field driven near the critical surface. At higher light intensities, the absorption begins to increase with the light intensity because of the relativistic effect of the laser pulse itself in the plasma, plasma wave breaking, and the excitation of parametric instabilities. For a given scale length, a similar intensity dependence of the absorption is found for different incidence angles and different initial electron temperatures.
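To indicate why relativistic effects become relevant near the quoted turnaround intensity, the following minimal sketch (not part of the paper) estimates the dimensionless laser amplitude a0 = eE/(m_e ω c) from the standard relation a0² ≈ Iλ²/(1.37×10¹⁸ W μm²/cm²) for linear polarization; the laser wavelength is an assumption, since the abstract does not state it.

```python
import math

def normalized_amplitude(intensity_w_cm2: float, wavelength_um: float) -> float:
    """Dimensionless amplitude a0 of a linearly polarized laser,
    a0^2 = I[W/cm^2] * lambda[um]^2 / 1.37e18."""
    return math.sqrt(intensity_w_cm2 * wavelength_um**2 / 1.37e18)

if __name__ == "__main__":
    I = 3.4e17    # W/cm^2, turnaround intensity quoted in the abstract
    lam = 1.06    # um, assumed Nd:glass-like wavelength (not given in the abstract)
    a0 = normalized_amplitude(I, lam)
    print(f"a0 ≈ {a0:.2f}")  # ~0.5, i.e. the electron quiver motion is mildly relativistic
```

At a0 of order 0.5 the electron quiver velocity approaches a significant fraction of c, which is consistent with relativistic corrections to the driven electrostatic field becoming important around this intensity.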
