It is well known that the maximum likelihood (ML) estimator exhibits a threshold effect, i.e., a rapid deterioration of estimation accuracy below a certain signal-to-noise ratio (SNR) or number of snapshots. This effect is caused by outliers and is not captured by standard tools such as the Cramér-Rao bound (CRB). The search for the SNR threshold value (below which the CRB becomes unreliable for predicting the ML estimator variance) can be carried out with the help of the Barankin bound (BB), as proposed by many authors. The major drawback of the BB, in comparison with the CRB, is the absence of a general analytical formula, which compels one to resort to a discrete form, usually the McAulay-Seidman bound (MSB), requiring a search for an optimum over a set of test points. In this paper, we propose a new practical discrete form of the BB that provides, for a given set of test points, an improved SNR threshold prediction in comparison with existing approximations (MSB, Abel bound, McAulay-Hofstetter bound), at the expense of a computational complexity increased by a factor less than (<i>P</i>+1)<sup>3</sup>, where <i>P</i> is the number of unknown parameters. We derive its expression for the general Gaussian observation model, to be used in place of existing approximations.
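To make the discrete-form idea concrete, the following is a minimal sketch (not taken from the paper) of the McAulay-Seidman bound for the simplest case: a scalar mean <i>θ</i> observed in Gaussian noise, x ~ N(θ, σ²), for which the likelihood-ratio moments have a closed form, E[L(θ+h<sub>n</sub>)L(θ+h<sub>k</sub>)] = exp(h<sub>n</sub>h<sub>k</sub>/σ²). The function name and interface are illustrative assumptions; the paper's bound applies to the general Gaussian observation model and to vector parameters.

```python
import numpy as np

def msb_gaussian_mean(sigma2, test_points):
    """McAulay-Seidman (discrete Barankin) bound for x ~ N(theta, sigma2).

    sigma2      : noise variance (scalar).
    test_points : offsets h_n of the test points theta + h_n.
    Returns h^T B^{-1} h, where B[n, k] = E[L_n L_k] - 1
    and, for this Gaussian mean model, E[L_n L_k] = exp(h_n h_k / sigma2).
    """
    h = np.asarray(test_points, dtype=float)
    B = np.exp(np.outer(h, h) / sigma2) - 1.0  # MSB kernel matrix
    return float(h @ np.linalg.solve(B, h))
```

As the test points shrink toward zero, the bound approaches the CRB (here σ²); for finite offsets it captures the large-error behavior that drives the threshold effect, which is what makes it useful for threshold SNR prediction.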