Abstract

Spin fluctuations in antiferromagnets typically lead to a reduction in sublattice magnetisation (i.e. spin reduction), which can be modelled using spin-wave theory. The effect is most pronounced near phase transitions such as the spin flop, and it depends both on the strength and direction of the applied magnetic field and on the temperature. Previous work assumed, for simplicity, that the applied field was parallel to an anisotropy axis; however, mean-field calculations suggest that small misalignments of the field may have a large effect on the results. The authors present a generalised spin-wave theory for the spin reduction in isotropically coupled antiferromagnets with second-order (i.e. axial or orthorhombic) anisotropy in arbitrary applied fields.
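The sensitivity to field misalignment can be illustrated at the classical mean-field level (a minimal sketch only, not the paper's spin-wave calculation): two unit sublattice spins with antiferromagnetic exchange J and easy-axis anisotropy K are confined to the plane spanned by the easy axis and the field, and the energy is minimised numerically. The parameter values and angle parametrisation below are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative classical two-sublattice model (assumed parametrisation):
# unit spins S_i = (sin t_i, cos t_i) in the plane containing the easy
# axis z and the applied field, which is tilted by `delta` from z.
J, K = 1.0, 0.05  # exchange and easy-axis anisotropy (arbitrary units)

def energy(angles, H, delta):
    t1, t2 = angles
    zeeman = np.cos(t1 - delta) + np.cos(t2 - delta)  # projection on field
    return (J * np.cos(t1 - t2)                       # AF exchange, J > 0
            - K * (np.cos(t1)**2 + np.cos(t2)**2)     # axial anisotropy
            - H * zeeman)

def magnetisation(H, delta=0.0):
    """Net moment per spin along the field, minimised over several starts."""
    starts = [(0.1, 3.0), (1.4, -1.4), (1.7, -1.2), (0.5, 2.5)]
    best = min((minimize(energy, s, args=(H, delta)) for s in starts),
               key=lambda r: r.fun)
    t1, t2 = best.x
    return 0.5 * (np.cos(t1 - delta) + np.cos(t2 - delta))

# Classical spin-flop field here is H_sf = 2*sqrt(K*(J - K)) ~ 0.44.
print(magnetisation(0.2))             # below the flop: near zero
print(magnetisation(0.8))             # above the flop: large moment
print(magnetisation(0.4, delta=0.1))  # misaligned field: canted below H_sf
```

For a field exactly along the axis the moment jumps discontinuously at the spin-flop field, whereas even a small misalignment (here 0.1 rad) produces canting, and hence a nonzero moment, below it, smearing the transition in the way the mean-field argument suggests.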
