Abstract

The amplification (attenuation) factor of an electromagnetic wave in the scattering of a relativistic electron by a nucleus in a moderately strong circularly polarized electromagnetic wave is studied theoretically. An amplification effect is found in a certain interval of polar angles of the incident electron; this interval depends strongly on the electron energy and the field intensity. The field amplification is shown to attain its maximum for nonrelativistic electrons in the range of medium fields. As the electron energy increases, the amplification decreases, vanishing for ultrarelativistic electrons. Increasing the field intensity at a given electron energy also leads to a slow decrease in the amplification; at high wave intensities, the amplification effect vanishes. It is shown that, in the optical frequency range for medium fields (F ∼ 10⁶ V/cm), the amplification factor of laser light may reach μ ∼ 10⁻¹ cm⁻¹ for sufficiently high-power electron beams.
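
As a rough illustration (not stated in the abstract itself): if μ is read as the usual exponential gain coefficient for intensity, the quoted value implies roughly an e-fold amplification over a 10 cm interaction length. Both this interpretation of μ and the length L = 10 cm are assumptions made here for the sake of the example:

```latex
% Hedged worked example: assuming mu is the standard exponential intensity
% gain coefficient, I(L) = I(0) e^{mu L}; the length L = 10 cm is illustrative.
\[
  I(L) = I(0)\, e^{\mu L},
  \qquad
  \left.\frac{I(L)}{I(0)}\right|_{\mu = 0.1\,\mathrm{cm}^{-1},\; L = 10\,\mathrm{cm}}
  = e^{1} \approx 2.72 .
\]
```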
