Abstract

The well-known plasma resonance (transformation of laser light into plasma waves) at the critical density in an inhomogeneous plasma easily becomes relativistically nonlinear even at nonrelativistic laser intensities, because the generated electrostatic field is strongly enhanced. The widely used theory of harmonic generation by plasma resonance in a laser-produced plasma should therefore be re-examined. We formulate the corresponding analytical theory of higher-order harmonic generation by nonrelativistically intense laser radiation propagating in a spatially inhomogeneous plasma. We find the spectral and angular characteristics of the harmonic radiation field and demonstrate the role of relativistic nonlinearity at plasma resonance in forming the harmonic spectra. The applicability range of the developed theory is set by the plasma wave-breaking condition in the vicinity of the critical plasma density, which we analyze explicitly. The proposed theory is compared with the standard perturbation approach. Because the latter is valid only at low laser intensities, this comparison clearly shows where the theory of harmonic generation via linear plasma resonance fails. The presented relativistic theory, applicable up to the laser intensity at which plasma wave breaking occurs, yields a smooth power-law energy spectrum of higher-order laser harmonics, in contrast to the standard perturbation theory. We also demonstrate a spectral modulation of the harmonics, which is a unique signature of the relativistic nonlinearity.
