Abstract

Large deviations theory is used to analyze the exponential rate of decrease of error probabilities for a sequence of decisions based on a sequence of test statistics (T_n). It is assumed that, for a given statistical hypothesis, the distributions of T_n are determined by some unknown member of a class of probability distributions. The worst-case, or least favorable, exponential rate of decrease of the error probability over this class is sought. It is shown that the Legendre-Fenchel transform of the maximized cumulant function yields a lower bound on the minimized large deviations rate function, and that in many cases this bound is tight. Application of the result is illustrated by a detailed treatment of i.i.d. (memoryless) detection with an epsilon-contamination family of distributions.
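To make the stated bound concrete, the following sketch uses standard large deviations notation; the symbols \Lambda_P, I_P, and the class \mathcal{P} are labels introduced here for illustration and are not necessarily those of the paper. For each candidate distribution P in the class \mathcal{P}, let

\[
\Lambda_P(\theta) = \lim_{n \to \infty} \tfrac{1}{n} \log E_P\!\left[e^{\,n \theta T_n}\right]
\]

denote the limiting cumulant (log-moment-generating) function, and let I_P be the corresponding large deviations rate function, which under Gärtner-Ellis-type conditions equals the Legendre-Fenchel transform \Lambda_P^*. Defining the maximized cumulant function \bar{\Lambda}(\theta) = \sup_{P \in \mathcal{P}} \Lambda_P(\theta), the pointwise inequality \bar{\Lambda} \ge \Lambda_P together with the order-reversing property of the Legendre-Fenchel transform gives

\[
\bar{\Lambda}^*(x) = \sup_{\theta}\,\bigl[\theta x - \bar{\Lambda}(\theta)\bigr] \;\le\; \Lambda_P^*(x) = I_P(x) \quad \text{for every } P \in \mathcal{P},
\]

so that \bar{\Lambda}^*(x) \le \inf_{P \in \mathcal{P}} I_P(x): the transform of the maximized cumulant function lower-bounds the minimized rate function, as stated in the abstract.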
