Abstract

Measurements obtained at ground‐based observatories are crucial to understanding the geomagnetic field and its secular variation (SV). However, current data processing methods rely on piecemeal closed‐source codes or are performed on an ad hoc basis, hampering efforts to reproduce data sets underlying published results. We present MagPySV, an open‐source Python package designed to provide a consistent and automated means of generating high‐resolution SV data sets from hourly means distributed by the Edinburgh World Data Centre. It applies corrections for documented baseline changes, and optionally, data may be excluded using the ap index, which removes effects from documented high solar activity periods such as geomagnetic storms. Robust statistics are used to identify and remove outliers. Developing existing denoising methods, we use principal component analysis of the covariance matrix of residuals between observed SV and that predicted by a global field model to remove a proxy for external field contamination from observations. This method creates a single covariance matrix for all observatories of interest combined and applies the denoising to all locations simultaneously, resulting in cleaner time series of the internally generated SV. In our case studies, we present cleaned data in two geographic regions: monthly first differences are used to investigate geomagnetic jerk morphology in Europe, an area previously well‐studied at lower resolution, and annual differences are investigated for northern high latitude regions, which are often neglected due to their high noise content. MagPySV may be run on the command line or within an interactive Jupyter notebook; two notebooks reproducing the case studies are supplied.
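The SV time series described above are formed by differencing field values: monthly first differences for the European case study and annual differences for the high-latitude one. A minimal sketch of the monthly-first-difference calculation, using made-up monthly means rather than real WDC data (the array values and scaling convention here are illustrative assumptions, not output of MagPySV):

```python
import numpy as np

# Hypothetical monthly means of one field component (nT) at a single
# observatory; in practice these would be derived from WDC hourly means.
monthly_means = np.array([48000.0, 48002.5, 48004.0, 48007.0, 48008.5])

# SV as monthly first differences, scaled from nT/month to nT/yr:
# dB/dt ~ 12 * (B[m+1] - B[m]) for monthly sampling.
sv_monthly = 12.0 * np.diff(monthly_means)

# Annual differences (B[m+12] - B[m]) would instead smooth out much of
# the sub-annual external signal, at the cost of temporal resolution.
print(sv_monthly)
```

Annual differencing trades resolution for noise suppression, which is why it suits the noisier high-latitude records.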

Highlights

  • External magnetic fields act to induce secondary fields in the Earth, which vary in time with the inducing field

  • We have presented MagPySV, a Python package for obtaining magnetic observatory hourly means from World Data Centre (WDC) Edinburgh, processing the raw data, and removing external magnetic field contamination

  • The eigensystem method employed permits use of all data, rather than discarding potentially useful data to reduce noise or selecting data using a geomagnetic index, which may be inappropriate because no single geomagnetic index is designed to capture all external field sources in all geographic regions


Summary

Introduction

These external fields act to induce secondary fields in the Earth, which vary in time with the inducing field. A method for denoising observatory means using principal component analysis (PCA), based on work by Wardinski and Holme (2011), Brown et al. (2013), and Feng et al. (2018), is presented in this work and implemented in the software. This denoising method permits the use of higher temporal resolution data than have previously been used and allows better characterization of noise in different geographic regions than prior studies. The AUX_OBS hourly data set (called AUX_OBS_2) originally covered the era of near-continuous satellite observation, from 1997 to the present, and is updated every 3 months to aid global field modeling efforts. Both quasi-definitive (QD) and definitive standard data are collated on a three-monthly basis for as many observatories as possible, and a thorough quality control (QC) procedure is applied, as described by Macmillan and Olsen (2013). Future updates will look to extend access to other data holdings and to implement access to data at other cadences.
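The PCA denoising idea can be sketched in a few lines: stack the residuals (observed SV minus model-predicted SV) from all observatories into one matrix, take the eigendecomposition of its covariance, and project out the largest-variance direction, which is treated as a proxy for the shared external-field signal. The synthetic residuals, mixing amplitudes, and single-component removal below are illustrative assumptions, not the MagPySV implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical residuals (observed SV minus field-model prediction) for
# three observatories x three components, stacked into nine columns.
# A common external-field signal contaminates every column at once.
n_times = 240
external = rng.standard_normal(n_times)        # shared noise source
mixing = rng.uniform(0.5, 1.5, size=9)         # per-column amplitude
residuals = np.outer(external, mixing) + 0.1 * rng.standard_normal((n_times, 9))

# PCA via the covariance matrix of the combined residuals.
cov = np.cov(residuals, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)         # eigenvalues ascending
noisy_direction = eigvecs[:, -1]               # largest-variance direction

# Remove the dominant (external-field proxy) direction from all
# observatories simultaneously, leaving cleaner internal SV residuals.
projection = residuals @ noisy_direction
denoised = residuals - np.outer(projection, noisy_direction)
```

Because one covariance matrix is built from all observatories combined, the correction is applied consistently across locations, which is the key difference from denoising each station in isolation.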

Method
Initial Data Processing
Case Study I
Case Study II
Polar Cap
Auroral Zone
Subauroral Zone
Code Implementation and QC
Findings
Summary
