Abstract

It is not fully understood how electromagnetic waves propagate through plasma density fluctuations when the size of the fluctuations is comparable to the wavelength of the incident radiation. In this paper, the perturbing effect of a turbulent plasma density layer on a traversing microwave beam is studied with full-wave simulations. The deterioration of the microwave beam is calculated as a function of the characteristic turbulence structure size, the turbulence amplitude, the depth of the interaction zone, and the waist size of the incident beam. The maximum scattering is observed for a structure size on the order of half the vacuum wavelength. The scattering and beam broadening were found to increase linearly with the depth of the turbulence layer and quadratically with the fluctuation strength. Consequences for experiments and 3D effects are considered.
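The reported scalings can be condensed into a single illustrative relation; this is a sketch of the stated dependencies, not the paper's fitted model, and the symbols δn/n₀ (relative fluctuation amplitude), L_turb (turbulence layer depth), ℓ_c (characteristic structure size), and λ₀ (vacuum wavelength) are notational assumptions introduced here:

\[
  \sigma_{\mathrm{scatt}} \;\propto\; \left(\frac{\delta n}{n_0}\right)^{2} L_{\mathrm{turb}},
  \qquad \text{with a maximum near } \ell_c \approx \lambda_0 / 2 .
\]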
