Abstract

We consider the evolution of magnetic fields under the influence of Hall drift and Ohmic decay. The governing equation is solved numerically in a spherical shell with r_i/r_o = 0.75. Taking simple free-decay modes as initial conditions, we follow their subsequent evolution. The Hall effect induces so-called helicoidal oscillations, in which energy is redistributed among the different modes. We find that the amplitude of these oscillations can be quite substantial, with some of the higher harmonics becoming comparable to the original field. Nevertheless, this transfer of energy to the higher harmonics is not sufficient to significantly accelerate the decay of the original field, at least not at the values R_B = O(100) accessible to us, where the Hall parameter R_B measures the ratio of the Ohmic timescale to the Hall timescale. We do, however, find clear evidence of increasingly fine structure developing at increasingly large R_B, suggesting that this Hall-induced cascade to ever shorter lengthscales may eventually become vigorous enough to enhance the decay of the original field. Finally, the implications for the evolution of neutron star magnetic fields are discussed.

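For context, the governing equation referred to above is commonly written as the electron-MHD induction equation with Hall and Ohmic terms. The nondimensional form and timescale-ratio definition below are a sketch consistent with the abstract's description, not quoted from the paper; they assume uniform electron density n_e and conductivity sigma, a characteristic field strength B_0, and Gaussian units.

\[
\frac{\partial \mathbf{B}}{\partial t}
  = - R_B\,\nabla\times\big[(\nabla\times\mathbf{B})\times\mathbf{B}\big]
  + \nabla^{2}\mathbf{B},
\qquad
R_B = \frac{t_{\mathrm{Ohm}}}{t_{\mathrm{Hall}}}
    = \frac{\sigma B_{0}}{n_{e}\,e\,c},
\]

where time is measured in units of the Ohmic timescale. In this scaling, R_B controls the relative strength of the (energy-conserving but mode-coupling) Hall term against Ohmic dissipation, which is why the abstract characterises the accessible regime by R_B = O(100).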