Abstract

The mathematical theory of reproducing kernel Hilbert spaces (RKHSs) provides powerful tools for minimum variance estimation (MVE) problems. Here, we extend the classical RKHS-based analysis of MVE in several directions. We develop a geometric formulation of five known lower bounds on the estimator variance (Barankin bound, Cramér-Rao bound, constrained Cramér-Rao bound, Bhattacharyya bound, and Hammersley-Chapman-Robbins bound) in terms of orthogonal projections onto a subspace of the RKHS associated with a given MVE problem. We show that, under mild conditions, the Barankin bound (the tightest possible lower bound on the estimator variance) is a lower semicontinuous function of the parameter vector. We also show that the RKHS associated with an MVE problem remains unchanged if the observation is replaced by a sufficient statistic. Finally, for MVE problems conforming to an exponential family of distributions, we derive novel closed-form lower bounds on the estimator variance and show that a reduction of the parameter set leaves the minimum achievable variance unchanged.
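
To make the projection formulation concrete, the following is a brief sketch of the standard RKHS construction for MVE (in the spirit of Parzen's reproducing-kernel approach), written for a scalar parameter and an unbiased estimator; the symbols f(y;\theta), \theta_0, g, \rho, and \mathcal{U} below are generic placeholders rather than notation fixed by the abstract.

\[
\rho_{\theta_0}(y,\theta) = \frac{f(y;\theta)}{f(y;\theta_0)}, \qquad
R_{\theta_0}(\theta_1,\theta_2) = \mathbb{E}_{\theta_0}\!\left[\rho_{\theta_0}(y,\theta_1)\,\rho_{\theta_0}(y,\theta_2)\right].
\]

The kernel R_{\theta_0} defines an RKHS \mathcal{H}(R_{\theta_0}) of functions on the parameter set. An unbiased estimator of g(\theta) with finite variance at \theta_0 exists if and only if g \in \mathcal{H}(R_{\theta_0}), and the minimum achievable variance at \theta_0 (the Barankin bound) is

\[
V(\theta_0) = \|g\|^2_{\mathcal{H}(R_{\theta_0})} - g^2(\theta_0).
\]

Each of the five bounds listed above is then obtained by replacing \|g\|^2 with \|\mathsf{P}_{\mathcal{U}}\, g\|^2, the squared norm of the orthogonal projection of g onto a suitable subspace \mathcal{U} \subseteq \mathcal{H}(R_{\theta_0}); for example, \mathcal{U} = \mathrm{span}\{R_{\theta_0}(\cdot,\theta_0),\ \partial R_{\theta_0}(\cdot,\theta)/\partial\theta\,\big|_{\theta=\theta_0}\} recovers the Cramér-Rao bound, while \mathcal{U} = \mathrm{span}\{R_{\theta_0}(\cdot,\theta_0),\ R_{\theta_0}(\cdot,\theta_0+\Delta)\} recovers the Hammersley-Chapman-Robbins bound.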
