Abstract

Dispersion of electromagnetic waves is usually described in terms of an integrodifferential equation. We show that whenever a differential operator can be found that annihilates the susceptibility kernel of the medium, dispersion can be modeled by a partial differential equation without nonlocal operators.

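A minimal worked example of the reduction described above, not drawn from the paper itself and using an assumed Debye-type kernel with illustrative constants \chi_0 and \tau: suppose the susceptibility kernel is

\chi(t) = \frac{\chi_0}{\tau}\, e^{-t/\tau}, \qquad t \ge 0,

which is annihilated by the first-order operator \partial_t + 1/\tau. Applying this operator to the nonlocal constitutive relation

P(t) = \varepsilon_0 \int_0^t \chi(t - s)\, E(s)\, ds

eliminates the convolution and yields the local equation

\tau\, \partial_t P + P = \varepsilon_0 \chi_0\, E,

so the dispersive Maxwell system can be written as a purely differential system in E, H, and P, with no nonlocal operator. Higher-order kernels (e.g. Lorentz-type) would be handled analogously by a higher-order annihilating operator.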