Abstract
Given a statistical model, a statistic on the model is sufficient if the Fisher metric of the induced model coincides with the original Fisher metric, according to the definition by Ay-Jost-Lê-Schwachhöfer. We introduce and study a quantitative version of this notion: for 0 < δ ≤ 1, we call a statistic δ-almost sufficient if δ²g(v,v) ≤ g′(v,v) for every tangent vector v of the parameter space, where g and g′ are the Fisher metrics of the original and the induced model, respectively. By the monotonicity theorem due to Amari-Nagaoka and Ay-Jost-Lê-Schwachhöfer, the Fisher metric g′ of the induced model for such a statistic is bi-Lipschitz equivalent to the original metric g, which means that the information loss of the statistic is uniformly bounded. We characterize such statistics in terms of the conditional probability, or by the existence of a certain decomposition of the density function, in a way similar to the characterizations of sufficient statistics due to Ay-Jost-Lê-Schwachhöfer and Fisher-Neyman.
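For a finite one-parameter model, the condition δ²g(v,v) ≤ g′(v,v) can be checked numerically. The sketch below uses a hypothetical toy model p_θ = (θ², θ − θ², 1 − θ) on a three-point sample space and a statistic T that collapses the first two points; the model, the statistic, and all function names are illustrative assumptions, not taken from the paper. Monotonicity guarantees g′/g ≤ 1, and the largest admissible δ at each θ is √(g′/g).

```python
import numpy as np

def fisher_info(p, dp):
    # Fisher information of a one-parameter finite model at a point theta:
    # g(theta) = sum_x p(x) (d/dtheta log p(x))^2 = sum_x dp(x)^2 / p(x)
    return np.sum(dp ** 2 / p)

def model(theta):
    # hypothetical toy model p_theta on X = {0, 1, 2}, with its theta-derivative
    p = np.array([theta ** 2, theta - theta ** 2, 1.0 - theta])
    dp = np.array([2 * theta, 1 - 2 * theta, -1.0])
    return p, dp

def induced(theta):
    # push the model forward along the statistic T: {0, 1} -> a, {2} -> b,
    # i.e. sum probabilities (and derivatives) over each fiber of T
    p, dp = model(theta)
    return np.array([p[0] + p[1], p[2]]), np.array([dp[0] + dp[1], dp[2]])

for theta in [0.2, 0.5, 0.8]:
    p, dp = model(theta)
    q, dq = induced(theta)
    ratio = fisher_info(q, dq) / fisher_info(p, dp)  # = g'/g, at most 1
    print(f"theta = {theta}: g'/g = {ratio:.4f}, max delta = {np.sqrt(ratio):.4f}")
```

Since the conditional distribution on the fiber {0, 1} depends on θ here, the statistic loses information (g′/g < 1), but the loss stays uniformly bounded on any compact parameter range away from the endpoints, so T is δ-almost sufficient there for a suitable δ.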