Abstract

We present an Exponential Series Estimator (ESE) for multivariate densities. The ESE has an appealing information-theoretic interpretation and ensures positive density estimates. We show that if the logarithm of a density p(x) defined on a bounded support is r times continuously differentiable, the ESE converges to p, in the sense of the Kullback-Leibler Information Criterion, at the optimal minimax rate. The same rate is achieved for the integrated squared error. We also derive the almost sure uniform convergence rate and then establish the asymptotic normality of the ESE. We undertake two sets of Monte Carlo experiments. The first examines performance on the bivariate normal mixtures investigated in Wand and Jones (1993); the second estimates a copula density function. The results demonstrate that the ESE is an efficacious multivariate density estimator, especially for small sample sizes and copula estimation. An empirical application to the joint distribution of stock returns is also presented.
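The abstract does not spell out the estimator's form, but exponential series (maximum-entropy) density estimators of this kind are commonly written as p_hat(x) = exp(sum_k theta_k * phi_k(x) - psi(theta)), where psi(theta) normalizes the density and theta is chosen by maximum likelihood, which amounts to matching the model moments of the basis functions to their sample moments. The sketch below illustrates that generic construction in the simplest univariate case on [0, 1]; it is not the authors' implementation, and the Legendre basis, the series order m, the quadrature grid, and the names (fit_ese, etc.) are illustrative assumptions.

```python
# Minimal sketch of a generic exponential series density estimator on [0, 1].
# Illustrative only; basis, order, and optimizer are assumptions, not the paper's.
import numpy as np
from numpy.polynomial import legendre
from scipy.integrate import trapezoid
from scipy.optimize import minimize


def fit_ese(x, m=4, n_grid=512):
    """Fit an order-m exponential series density estimate on [0, 1] by MLE."""
    grid = np.linspace(0.0, 1.0, n_grid)

    def basis(t):
        # Shifted Legendre polynomials phi_1, ..., phi_m evaluated on [0, 1].
        s = 2.0 * np.asarray(t, dtype=float) - 1.0
        return np.column_stack(
            [legendre.legval(s, np.eye(m + 1)[k]) for k in range(1, m + 1)]
        )

    B_grid = basis(grid)
    sample_moments = basis(x).mean(axis=0)

    def neg_avg_loglik(theta):
        # psi(theta) = log of the integral of exp(theta' phi(x)) dx, by quadrature.
        psi = np.log(trapezoid(np.exp(B_grid @ theta), grid))
        return psi - sample_moments @ theta

    # The average log-likelihood is concave in theta, so BFGS converges reliably.
    theta = minimize(neg_avg_loglik, np.zeros(m), method="BFGS").x
    psi = np.log(trapezoid(np.exp(B_grid @ theta), grid))
    # The fitted density is an exponential, hence strictly positive everywhere.
    return lambda t: np.exp(basis(t) @ theta - psi)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.beta(2.0, 5.0, size=500)        # synthetic data on [0, 1]
    p_hat = fit_ese(data, m=4)
    print(p_hat(np.linspace(0.05, 0.95, 5)))   # strictly positive estimates
```

Because the estimate is the exponential of a finite series, positivity holds by construction, which is the property the abstract emphasizes. A multivariate version of this sketch would replace the univariate basis with a tensor-product basis, as would be needed for the copula and bivariate mixture settings described in the Monte Carlo experiments.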
