Abstract

It has recently been proposed that geomagnetic jerks observed at the Earth's surface could be viewed as singularities in the time behavior of the geomagnetic field with a regularity of about 1.5 when wavelet analyzed. Such a signal should have suffered some distortion when diffusing from the core‐mantle boundary (CMB) through the conducting mantle. Assuming that the upper mantle is an insulator and given the electromagnetic time constant of the mantle, we compute the distortion that a pure singularity introduced at the CMB suffers as it traverses the mantle. We compute this distortion through its effects on the so‐called ridge functions extracted from the wavelet transform of the signal. This distortion is very similar to the small but significant one that we observe in real data. We therefore speculate that jerks must have been pure singularities at the base of the mantle and infer an average estimate for the mantle electromagnetic time constant from the way the signal is distorted by fitting the synthetic ridge functions to the experimental ones. Assuming, for example, a thickness of 2000 km for a uniform lower conducting mantle, we find an electrical conductivity smaller than 10 S m⁻¹. This value is in reasonable agreement with values derived from high‐pressure experiments for a silicate mantle.
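The key property the abstract relies on is that a pure singularity of Hölder regularity α = 1.5, f(t) = |t − t₀|^α, has a continuous wavelet transform whose modulus along the ridge at t₀ scales as s^(α + 1/2) (with an L²-normalized wavelet); departures from this power law are what diagnose diffusive distortion by the conducting mantle. The following minimal sketch (not the authors' code; all parameter choices are illustrative assumptions) verifies that scaling numerically with a Mexican-hat wavelet, which has two vanishing moments, enough to detect α = 1.5:

```python
import numpy as np

# Assumed regularity of the jerk singularity (from the abstract).
alpha = 1.5
dt = 1e-4                                   # quadrature step (illustrative)
scales = np.array([0.02, 0.04, 0.08, 0.16]) # illustrative dilation scales

def cwt_at_singularity(s):
    """W(s, t0) for f(u) = |u|**alpha, Mexican-hat wavelet, L2 normalization."""
    u = np.arange(-8 * s, 8 * s, dt)          # truncated wavelet support
    x = u / s
    psi = (1.0 - x**2) * np.exp(-x**2 / 2.0)  # Mexican hat (unnormalized)
    # W(s, t0) = (1/sqrt(s)) * integral of f(u) * psi(u/s) du
    return dt / np.sqrt(s) * np.sum(np.abs(u)**alpha * psi)

w = np.array([abs(cwt_at_singularity(s)) for s in scales])
# Ridge scaling: log|W| vs log s should have slope alpha + 1/2 = 2.0
slope = np.polyfit(np.log(scales), np.log(w), 1)[0]
print(f"ridge slope ≈ {slope:.2f}")
```

A mantle acting as a diffusive low-pass filter would bend this log-log ridge away from a straight line at short scales, which is the small but significant distortion the abstract says is fitted to estimate the mantle's electromagnetic time constant.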
