Abstract

An i.i.d. process $\mathbf{X}$ is considered on a compact metric space $\mathsf{X}$. Its marginal distribution $\pi$ is unknown, but is assumed to lie in a moment class of the form $\mathcal{P} = \{\pi : \langle \pi, f_i \rangle = c_i,\ i = 1, \dots, n\}$, where $\{f_i\}$ are real-valued, continuous functions on $\mathsf{X}$ and $\{c_i\}$ are constants. The following conclusions are obtained: (i) For any probability distribution $\mu$ on $\mathsf{X}$, Sanov's rate function for the empirical distributions of $\mathbf{X}$ is equal to the Kullback–Leibler divergence $D(\mu \,\|\, \pi)$. The worst-case rate function is identified as $L(\mu) := \inf_{\pi \in \mathcal{P}} D(\mu \,\|\, \pi) = \sup_{\lambda \in \mathcal{R}(f,c)} \langle \mu, \log(\lambda^{\mathsf{T}} f) \rangle$, where $f = (1, f_1, \dots, f_n)^{\mathsf{T}}$ and $\mathcal{R}(f,c) \subset \mathbb{R}^{n+1}$ is a compact, convex set. (ii) A stochastic approximation algorithm for computing $L$ is introduced, based on samples of the process $\mathbf{X}$. (iii) A solution to the worst-case one-dimensional large-deviation problem is obtained through properties of extremal distributions, generalizing Markov's canonical distributions. (iv) Applications to robust hypothesis testing and to the theory of buffer overflows in queues are also developed.
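To make the worst-case rate function concrete, the following is a minimal numerical sketch (not from the paper): it approximates $L(\mu) = \inf_{\pi \in \mathcal{P}} D(\mu \,\|\, \pi)$ by discretizing the state space onto a finite grid and solving the primal convex program directly with SciPy, rather than implementing the dual representation over $\mathcal{R}(f,c)$ or the stochastic approximation algorithm of item (ii). The grid, reference distribution, moment function, and constant below are illustrative assumptions.

```python
# Grid-based approximation of L(mu) = inf { D(mu || pi) : <pi, f_i> = c_i }.
# This is a sketch under assumed inputs, not the paper's algorithm.
import numpy as np
from scipy.optimize import minimize

def worst_case_rate(mu, F, c):
    """Approximate L(mu) on a finite grid.

    mu : (m,) probabilities of the reference distribution mu on the grid
    F  : (n, m) values f_i(x_j) of the moment functions on the grid
    c  : (n,) prescribed moments c_i defining the class P
    """
    m = mu.shape[0]

    def kl(pi):
        # D(mu || pi) = sum_j mu_j log(mu_j / pi_j); skip points with mu_j = 0
        pi = np.clip(pi, 1e-12, None)
        mask = mu > 0
        return np.sum(mu[mask] * np.log(mu[mask] / pi[mask]))

    # pi must be a probability distribution satisfying the moment constraints
    cons = [{"type": "eq", "fun": lambda pi: np.sum(pi) - 1.0}]
    for i in range(F.shape[0]):
        cons.append({"type": "eq", "fun": lambda pi, i=i: F[i] @ pi - c[i]})

    res = minimize(kl, x0=mu.copy(), bounds=[(1e-12, 1.0)] * m,
                   constraints=cons, method="SLSQP")
    return res.fun, res.x

# Illustrative example: X = [0, 1] with one moment constraint <pi, x> = 0.5
grid = np.linspace(0.0, 1.0, 101)
mu = np.exp(-5 * grid); mu /= mu.sum()   # an assumed reference distribution
F = grid[None, :]                        # f_1(x) = x
c = np.array([0.5])
L, pi_star = worst_case_rate(mu, F, c)
print("approximate L(mu):", L)
```

The optimizer returns both the approximate value of $L(\mu)$ and the minimizing distribution $\pi^\star \in \mathcal{P}$; refining the grid tightens the approximation, since the constraint set and objective are convex in $\pi$.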
