Abstract
In this paper, a new class of lower bounds on the mean square error (MSE) of unbiased estimators of deterministic parameters is proposed. The proposed class is derived by projecting each entry of the estimation-error vector onto a Hilbert subspace of L2. This Hilbert subspace contains linear transformations of elements in the domain of an integral transform of the likelihood-ratio function. The integral transform generalizes the traditional derivative and sampling operators, which are applied to the likelihood-ratio function to compute performance lower bounds such as the Cramér-Rao, Bhattacharyya, and McAulay-Seidman bounds. It is shown that several well-known lower bounds on the MSE of unbiased estimators can be derived from this class by modifying the kernel of the integral transform. A new lower bound is derived from the proposed class using the kernel of the Fourier transform. In comparison with other existing bounds, the proposed bound is computationally manageable and provides a better prediction of the threshold region of the maximum-likelihood estimator in the problem of single-tone estimation.
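The threshold phenomenon mentioned above can be illustrated numerically: at high SNR the MSE of the maximum-likelihood frequency estimator for a single tone follows the Cramér-Rao bound (CRB), while below a threshold SNR outlier errors make the MSE depart sharply from it. The following Monte Carlo sketch is not the paper's proposed Fourier-kernel bound; it only demonstrates the threshold behavior against the standard CRB for the frequency of a single complex tone in white Gaussian noise. The sample size `N`, true frequency `f0`, trial count, and FFT grid size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 64       # number of samples (illustrative assumption)
f0 = 0.12    # true normalized frequency, cycles/sample (illustrative)
n = np.arange(N)

def crb_freq(snr_lin, N):
    # Standard CRB on the variance of the normalized-frequency estimate
    # for a single complex tone in white Gaussian noise.
    return 6.0 / ((2 * np.pi) ** 2 * snr_lin * N * (N ** 2 - 1))

def ml_freq_estimate(x, grid_size=4096):
    # ML estimate approximated by the periodogram peak on a
    # zero-padded FFT grid (grid size is an illustrative choice).
    spec = np.abs(np.fft.fft(x, grid_size))
    return np.argmax(spec) / grid_size

def mc_mse(snr_db, trials=500):
    # Monte Carlo MSE of the ML frequency estimate at a given SNR.
    snr_lin = 10 ** (snr_db / 10)
    sigma = np.sqrt(1.0 / snr_lin)  # noise std for a unit-amplitude tone
    sq_errs = []
    for _ in range(trials):
        noise = sigma * (rng.standard_normal(N)
                         + 1j * rng.standard_normal(N)) / np.sqrt(2)
        x = np.exp(2j * np.pi * f0 * n) + noise
        fhat = ml_freq_estimate(x)
        # wrap the frequency error into (-0.5, 0.5]
        e = (fhat - f0 + 0.5) % 1.0 - 0.5
        sq_errs.append(e ** 2)
    return float(np.mean(sq_errs))

if __name__ == "__main__":
    for snr_db in (-16, 0, 10):
        snr_lin = 10 ** (snr_db / 10)
        print(f"SNR {snr_db:+3d} dB:  MSE {mc_mse(snr_db):.3e}"
              f"  CRB {crb_freq(snr_lin, N):.3e}")
```

At high SNR the simulated MSE sits close to the CRB; well below the threshold (here around -16 dB for these parameters) outlier peak-picking errors dominate and the MSE exceeds the CRB by orders of magnitude, which is the region where tighter bounds of the kind proposed in the paper are useful.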