1. Introduction. Consider the estimation of a probability density function p(x) defined on a bounded interval. We approximate the logarithm of the density by a basis function expansion consisting of polynomials, splines or trigonometric series. The expansion yields a regular exponential family within which we estimate the density by the method of maximum likelihood. This method of density estimation arises by application of the principle of maximum entropy or minimum relative entropy subject to empirical constraints. We show that if the logarithm of the density has r square-integrable derivatives, $\int |D^r \log p|^2 < \infty$, then the sequence of density estimators $\hat{p}_n$ converges to $p$ in the sense of relative entropy (Kullback-Leibler distance) $\int p \log(p/\hat{p}_n)$ at rate $O_P(1/m^{2r} + m/n)$ as $m \to \infty$ and $m^2/n \to 0$ in the spline and trigonometric cases and $m^3/n \to 0$ in the polynomial case, where m is the dimension of the family and n is the sample size. Boundary conditions are assumed for the density in the trigonometric case. This convergence rate specializes to $O_P(n^{-2r/(2r+1)})$ by setting $m = n^{1/(2r+1)}$ when the log-density is known to have degree of smoothness at least r. Analogous convergence results for the relative entropy are shown to hold in general, for any class of log-density functions and sequence of finite-dimensional linear spaces having $L_2$ and $L_\infty$ approximation properties. The approximation of log-densities using polynomials has previously been considered by Neyman (1937) to define alternatives for goodness-of-fit tests, by Good (1963) as an application of the method of maximum entropy or minimum relative entropy, by Crain (1974, 1976a, b, 1977), who demonstrates existence and consistency of the maximum likelihood estimator, and by Mead and Papanicolaou (1984).
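To make the construction concrete, here is a minimal sketch, in Python, of the estimator described above in the polynomial case: the log-density is expanded in a basis of dimension m, which yields an exponential family, and the coefficients are fit by maximum likelihood. The basis choice (rescaled Legendre polynomials), the target density, the quadrature grid for the normalizing constant, and all names are illustrative assumptions, not the paper's implementation.

```python
# A minimal, hypothetical sketch (not the authors' code): maximum likelihood
# in the exponential family p_theta(x) = exp(theta' phi(x) - psi(theta)) on
# [0, 1], with a polynomial (Legendre) basis phi of dimension m.  The
# normalizer psi(theta) is computed by quadrature on a fine grid.
import numpy as np
from scipy.integrate import trapezoid
from scipy.optimize import minimize
from scipy.special import eval_legendre

def basis(x, m):
    # phi_1, ..., phi_m: Legendre polynomials rescaled from [-1, 1] to [0, 1]
    return np.column_stack([eval_legendre(j, 2.0 * x - 1.0) for j in range(1, m + 1)])

def log_normalizer(theta, m, grid):
    # psi(theta) = log of the integral of exp(theta' phi(x)) over [0, 1]
    return np.log(trapezoid(np.exp(basis(grid, m) @ theta), grid))

def neg_log_likelihood(theta, data, m, grid):
    # -(1/n) sum_i log p_theta(X_i) = psi(theta) - mean_i theta' phi(X_i);
    # this objective is convex in theta for a regular exponential family
    return log_normalizer(theta, m, grid) - (basis(data, m) @ theta).mean()

def sample_target(n, rng):
    # rejection sampling from p(x) proportional to exp(sin(2*pi*x)), whose
    # log-density is smooth on [0, 1]; exp(sin(.)) <= e gives the envelope
    out = np.empty(0)
    while out.size < n:
        x = rng.uniform(0.0, 1.0, n)
        u = rng.uniform(0.0, 1.0, n)
        out = np.concatenate([out, x[u < np.exp(np.sin(2 * np.pi * x) - 1.0)]])
    return out[:n]

rng = np.random.default_rng(0)
n, m = 500, 4                          # sample size and family dimension
data = sample_target(n, rng)
grid = np.linspace(0.0, 1.0, 2001)

res = minimize(neg_log_likelihood, np.zeros(m), args=(data, m, grid))
theta_hat = res.x
log_p_hat = basis(grid, m) @ theta_hat - log_normalizer(theta_hat, m, grid)
print("theta_hat:", np.round(theta_hat, 3))
```

In this sketch the dimension m is fixed by hand; the theory quoted above would instead let m grow with n, for instance $m = n^{1/(2r+1)}$ when the log-density has r square-integrable derivatives.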
