Abstract

It is often assumed that quantum effects are significant only in low temperature, high density plasmas. This paper challenges that assumption by considering the quantum contribution to the Landau damping of electron plasma waves in high density plasmas at normal temperature. Starting from the linearized Vlasov equation including the Bohm quantum potential, the dispersion relation for electron plasma waves propagating in a quantum plasma is derived, together with the linear Landau damping rate and the equations governing this process. The results indicate that quantum effects enlarge the effective plasma frequency, which can be attributed to an increase in the effective charge or number density of the plasma electrons. As a result, the Debye length is reduced and Debye screening becomes more pronounced; in this sense, the quantum behavior manifests as a screening effect. The Landau damping rate is reduced by quantum effects, and the exchange of energy between particles and waves is correspondingly retarded.
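The quantum correction described above can be illustrated numerically. In the quantum hydrodynamic treatment, the Bohm potential adds a term ħ²k⁴/(4mₑ²) to the classical Bohm–Gross dispersion relation ω² = ωₚ² + 3k²v_th², raising the effective wave frequency. The sketch below compares the classical and quantum-corrected frequencies for a dense plasma at room temperature; the specific density, temperature, and wavenumber values are illustrative assumptions, not taken from the paper.

```python
import math

# Physical constants (SI units)
e = 1.602176634e-19      # elementary charge [C]
m_e = 9.1093837015e-31   # electron mass [kg]
eps0 = 8.8541878128e-12  # vacuum permittivity [F/m]
k_B = 1.380649e-23       # Boltzmann constant [J/K]
hbar = 1.054571817e-34   # reduced Planck constant [J s]

def plasma_frequency(n):
    """Electron plasma frequency: omega_p = sqrt(n e^2 / (eps0 m_e))."""
    return math.sqrt(n * e**2 / (eps0 * m_e))

def omega_classical(k, n, T):
    """Classical Bohm-Gross dispersion: omega^2 = omega_p^2 + 3 k^2 v_th^2."""
    vth2 = k_B * T / m_e
    return math.sqrt(plasma_frequency(n)**2 + 3 * k**2 * vth2)

def omega_quantum(k, n, T):
    """Quantum-corrected dispersion with the Bohm-potential term
    hbar^2 k^4 / (4 m_e^2), as in quantum hydrodynamic models."""
    vth2 = k_B * T / m_e
    return math.sqrt(plasma_frequency(n)**2 + 3 * k**2 * vth2
                     + hbar**2 * k**4 / (4 * m_e**2))

# Illustrative parameters (assumed): a dense plasma at normal temperature
n = 1e28   # electron number density [m^-3]
T = 300.0  # temperature [K]
k = 1e9    # wavenumber [1/m]

w_c = omega_classical(k, n, T)
w_q = omega_quantum(k, n, T)
print(f"classical omega = {w_c:.4e} rad/s")
print(f"quantum   omega = {w_q:.4e} rad/s")
```

Because the Bohm term enters the dispersion relation with a positive sign, the quantum frequency always exceeds the classical one, consistent with the enlarged effective frequency noted in the abstract; the correction grows rapidly (as k⁴) at short wavelengths.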
