Abstract
This paper examines the asymptotic behavior of the posterior distribution of a possibly nondifferentiable function g(θ), where θ is a finite-dimensional parameter of either a parametric or semiparametric model. The main assumption is that the distribution of a suitable estimator θ̂n, its bootstrap approximation, and the Bayesian posterior for θ all agree asymptotically. It is shown that whenever g is locally Lipschitz, though not necessarily differentiable, the posterior distribution of g(θ) and the bootstrap distribution of g(θ̂n) coincide asymptotically. One implication is that Bayesians can interpret bootstrap inference for g(θ) as approximately valid posterior inference in large samples. Another implication—built on known results about bootstrap inconsistency—is that credible intervals for a nondifferentiable parameter g(θ) cannot be presumed to be approximately valid confidence intervals (even when this relation holds true for θ).
Highlights
This paper studies the posterior distribution of a real-valued function g(θ), where θ is a parameter of finite dimension in either a parametric or semiparametric model
This paper studies the asymptotic behavior of the posterior distribution of parameters of the form g(θ), where g(·) is locally Lipschitz continuous but possibly nondifferentiable
We show that the bootstrap distribution of g(θ̂n) and the posterior distribution of g(θ) are asymptotically equivalent
Summary
This paper studies the posterior distribution of a real-valued function g(θ), where θ is a finite-dimensional parameter of either a parametric or semiparametric model. The distinction between the local Lipschitz property and directional differentiability emphasized in our main result is not just a technical refinement. We believe that such a distinction is practically useful, for example, when conducting Bayesian estimation and inference on the bounds of the identified set in partially identified models, as recently suggested by Kline and Tamer (2016) and Giacomini and Kitagawa (2018). Decision-theoretic optimality of the Bayes estimator can be attached to the bootstrap-based estimator for g(θ) in large samples, irrespective of whether g(θ) is differentiable. This means that Bayesians can use bootstrap draws to conduct approximate posterior estimation and inference for g(θ) when computing θ̂n is simpler than Markov chain Monte Carlo (MCMC) sampling.
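A minimal sketch of this idea, under assumptions not taken from the paper: for a Gaussian mean θ with a flat prior, the posterior is approximately N(θ̂n, s²/n), while the nonparametric bootstrap approximates the sampling distribution of θ̂n by resampling the data. Applying a locally Lipschitz but nondifferentiable map, here g(θ) = max(θ, 0), to both sets of draws illustrates the asymptotic equivalence: the bootstrap draws of g(θ̂n) can serve as approximate posterior draws of g(θ), with no MCMC required.

```python
# Hypothetical illustration (not code from the paper): compare bootstrap draws
# of g(theta_hat) with posterior draws of g(theta) for g(x) = max(x, 0),
# a locally Lipschitz map that is nondifferentiable at 0.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
data = rng.normal(loc=0.3, scale=1.0, size=n)

theta_hat = data.mean()
s = data.std(ddof=1)
B = 5000

# Bootstrap draws of g(theta_hat): resample the data, re-estimate, transform.
boot = np.array([rng.choice(data, size=n, replace=True).mean() for _ in range(B)])
g_boot = np.maximum(boot, 0.0)

# Posterior draws of g(theta) under a flat prior (normal posterior approximation).
post = rng.normal(loc=theta_hat, scale=s / np.sqrt(n), size=B)
g_post = np.maximum(post, 0.0)

# For large n the two sets of quantiles should be close.
q_boot = np.quantile(g_boot, [0.05, 0.5, 0.95])
q_post = np.quantile(g_post, [0.05, 0.5, 0.95])
print(np.round(q_boot, 3), np.round(q_post, 3))
```

Note that this agreement concerns the posterior and the bootstrap; as the abstract cautions, when g is nondifferentiable neither distribution need yield intervals with correct frequentist coverage.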