Abstract

Modern scientific instruments readily record various dynamical phenomena at high frequency and for extended durations. Spanning timescales across several orders of magnitude, such “high-throughput” (HTP) data are routinely analyzed with parametric models in the frequency domain. However, the large size of HTP datasets can render maximum likelihood estimation prohibitively expensive. Moreover, HTP recording devices are operated by extensive electronic circuitry, producing periodic noise to which parameter estimates are highly sensitive. This article proposes to address these issues with a two-stage approach. Preliminary parameter estimates are first obtained by a periodogram variance-stabilizing procedure, for which data compression greatly reduces computational costs with minimal impact on statistical efficiency. Next, a novel test with false discovery rate control eliminates most periodic outliers, to which the second-stage estimator becomes more robust. Extensive simulations and experimental results indicate that for a widely used model in HTP data analysis, a substantial reduction in mean squared error can be expected by applying our methodology.
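As a generic illustration of the ingredients the abstract mentions (not the paper's exact procedure), the sketch below computes a raw periodogram, applies the standard log transform that stabilizes the variance of its approximately Exponential ordinates, and compresses the result by block averaging. The block size `B` and the white-noise input are hypothetical choices for demonstration only.

```python
import numpy as np

# Generic sketch (not the authors' method): periodogram ordinates of a
# stationary series are approximately independent Exponential variables
# scaled by the spectral density, so taking logs stabilizes their
# variance (to pi^2/6) across frequencies.
rng = np.random.default_rng(0)
x = rng.normal(size=4096)        # hypothetical HTP recording (white noise)
n = x.size

# Raw periodogram at the Fourier frequencies, excluding 0 and Nyquist.
dft = np.fft.rfft(x)[1:-1]
per = (np.abs(dft) ** 2) / n

# Variance-stabilizing log transform; the Euler-Mascheroni constant
# corrects the mean bias of a log-Exponential variable.
log_per = np.log(per) + np.euler_gamma

# "Data compression": average log-periodogram ordinates in blocks of B,
# shrinking the dataset while roughly preserving spectral information.
B = 16
m = (log_per.size // B) * B
compressed = log_per[:m].reshape(-1, B).mean(axis=1)

print(per.size, compressed.size)
```

For unit-variance white noise the true log spectral density is constant at zero, so after the bias correction the mean of `log_per` should be close to zero, while the compressed series is 16 times shorter than the raw periodogram.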
