Abstract

When the noise affecting time series is colored with unknown statistics, a difficulty for sinusoid detection is to control the true significance level of the test outcome. This paper investigates the possibility of using training data sets of the noise to improve this control. Specifically, we analyze the performance of various detectors applied to periodograms standardized using training data sets. Emphasis is put on sparse detection in the Fourier domain and on the limitation posed by the necessarily finite size of the training sets available in practice. We study the resulting false alarm and detection rates and show that standardization leads in some cases to powerful constant false alarm rate tests. The study is both analytical and numerical. Although the analytical results are derived in an asymptotic regime, numerical results show that the theory accurately describes the tests' behaviour for moderately large sample sizes. Throughout the paper, the considered periodogram standardization is illustrated by an application to exoplanet detection in radial velocity data.
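As an illustrative sketch only, and not the paper's exact procedure, the following Python snippet shows the kind of periodogram standardization described above: a noise spectrum estimate is averaged over training noise series, the periodogram of the observed series is divided by this estimate, and a max-based test with a threshold from the asymptotic Exp(1) approximation is applied. The AR(1) noise model, the sinusoid amplitude, and the threshold calibration are assumptions made for this demo; with a finite number of training series the true false alarm rate deviates from the nominal one, which is the effect studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N, L, alpha = 256, 100, 0.01        # sample size, number of training series, target false alarm rate

def periodogram(x):
    """Classical periodogram at positive Fourier frequencies (DC and Nyquist dropped)."""
    X = np.fft.rfft(x)
    return (np.abs(X[1:-1]) ** 2) / len(x)

def colored_noise(n, phi=0.7):
    """AR(1) noise, used here only as a stand-in for colored noise with unknown statistics."""
    e = rng.standard_normal(n)
    x = np.empty(n)
    x[0] = e[0]
    for t in range(1, n):
        x[t] = phi * x[t - 1] + e[t]
    return x

# Training step: average the periodograms of L noise-only series to estimate the noise spectrum.
P_train = np.mean([periodogram(colored_noise(N)) for _ in range(L)], axis=0)

# Observed series: colored noise plus a sinusoid at one Fourier frequency.
t = np.arange(N)
x = colored_noise(N) + 0.8 * np.sin(2 * np.pi * 30 * t / N)

# Standardized periodogram: each ordinate divided by the training-based spectrum estimate.
P_std = periodogram(x) / P_train
K = P_std.size

# Max test: under the asymptotic approximation the standardized noise ordinates are
# roughly i.i.d. Exp(1), so P(max <= u) ~ (1 - exp(-u))^K, giving the threshold below.
threshold = -np.log(1.0 - (1.0 - alpha) ** (1.0 / K))
print("max ordinate:", P_std.max(), "threshold:", threshold, "detect:", P_std.max() > threshold)
```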
