Abstract

This paper addresses the problem of testing whether the Mahalanobis distance between a random signal Θ and a known deterministic model θ₀ exceeds a given non-negative real number, when Θ has an unknown probability distribution and is observed in additive independent Gaussian noise with positive definite covariance matrix. When Θ is deterministic and unknown, we prove the existence of thresholding tests on the Mahalanobis distance to θ₀ that have specified level and maximal constant power (MCP). The MCP property is a new optimality criterion involving Wald's notion of tests with uniformly best constant power (UBCP) on ellipsoids for testing the mean of a normal distribution. When the signal is random with unknown distribution, constant power maximality extends to maximal constant conditional power (MCCP), and the thresholding tests on the Mahalanobis distance to θ₀ still satisfy this novel optimality property. Our results apply to the detection of signals in independent additive Gaussian noise. In particular, for a large class of possible model mismatches, MCCP tests can guarantee a specified false alarm probability, in contrast to standard Neyman-Pearson tests that may not respect this constraint.
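
To make the testing problem concrete, the sketch below illustrates a generic thresholding test on the Mahalanobis distance to θ₀, not the paper's exact construction. It assumes the observation x = θ + noise with known noise covariance C, and it sets the threshold from a noncentral chi-square quantile at the boundary distance τ so that the false alarm probability is at most the nominal level α; the function name and the example values are hypothetical.

```python
import numpy as np
from scipy.stats import ncx2

def mahalanobis_threshold_test(x, theta0, cov, tau, alpha=0.05):
    """Hypothetical sketch: decide whether the Mahalanobis distance of the
    underlying signal to theta0 exceeds tau, from a noisy observation x."""
    d = len(theta0)
    cov_inv = np.linalg.inv(cov)
    # Squared Mahalanobis distance of the observation to the model theta0
    t2 = float((x - theta0) @ cov_inv @ (x - theta0))
    # On the boundary (signal at Mahalanobis distance tau from theta0),
    # t2 follows a noncentral chi-square law with d degrees of freedom
    # and noncentrality tau**2; its (1 - alpha)-quantile gives a threshold
    # that keeps the false alarm probability at level alpha (assumption).
    lam2 = ncx2.ppf(1.0 - alpha, df=d, nc=tau**2)
    return t2 > lam2, np.sqrt(t2), np.sqrt(lam2)

# Illustrative usage: 3-dimensional signal in correlated Gaussian noise
rng = np.random.default_rng(0)
theta0 = np.zeros(3)
cov = np.array([[1.0, 0.3, 0.0],
                [0.3, 1.0, 0.2],
                [0.0, 0.2, 1.0]])
theta = np.array([2.0, -1.0, 0.5])   # unknown signal (here fixed for the demo)
x = theta + rng.multivariate_normal(np.zeros(3), cov)
decision, stat, threshold = mahalanobis_threshold_test(x, theta0, cov, tau=1.0)
print(decision, stat, threshold)
```

The design choice illustrated here is the one emphasized in the abstract: the decision rule compares the Mahalanobis distance of the observation to θ₀ against a fixed threshold, so the guaranteed level does not depend on the unknown distribution of the signal.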
