Abstract

Estimation with an assigned risk is a classical statistical problem, and the theory is well developed for directly observed (no missing) data. This article considers the more complicated problem of estimating the spectral density in the presence of missing data. First, the corresponding theory of sequential estimation with minimal expected stopping time is developed. It is then shown that a two-stage estimator may be used and that it attains the minimal stopping time. The sample size of the first stage may be deterministic and of smaller order than the minimal stopping time, and the first stage then determines the size of the second stage. Furthermore, the estimator adapts both to the unknown smoothness of the underlying spectral density and to the underlying missing mechanism.
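The two-stage idea described above can be illustrated with the classical Stein-type procedure for mean estimation under an assigned mean-squared-error risk: a deterministic pilot stage estimates the unknown variance, and that estimate sets the size of the second stage. This is a minimal sketch of the general principle only, not the paper's spectral-density estimator; the Gaussian data, the true mean 2.0, and the pilot size are illustrative assumptions.

```python
import numpy as np

def two_stage_mean_estimate(risk, n1, rng):
    """Two-stage (Stein-type) estimation of a mean with assigned MSE `risk`.

    Stage 1 is a deterministic pilot sample of size n1; its sample
    variance determines the total sample size needed so that the
    estimator's variance does not exceed the assigned risk.
    (Illustrative data: i.i.d. N(2, 3^2) observations.)
    """
    # Stage 1: deterministic pilot sample, used to estimate the variance.
    stage1 = rng.normal(loc=2.0, scale=3.0, size=n1)
    var_hat = stage1.var(ddof=1)

    # Stage 2: choose total n so that var_hat / n <= risk.
    n_total = max(n1, int(np.ceil(var_hat / risk)))
    stage2 = rng.normal(loc=2.0, scale=3.0, size=n_total - n1)

    # Final estimate uses both stages.
    sample = np.concatenate([stage1, stage2])
    return sample.mean(), n_total

rng = np.random.default_rng(0)
est, n_total = two_stage_mean_estimate(risk=0.01, n1=30, rng=rng)
```

Here the pilot size `n1` can be of smaller order than the final sample size (e.g. `n1` growing like the square root of the total), mirroring the abstract's point that the first stage may be deterministic and of smaller order than the minimal stopping time.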
