Abstract

We establish presumably optimal rates of normal convergence with respect to the Kolmogorov distance for a large class of geometric functionals of marked Poisson and binomial point processes on general metric spaces. The rates are valid whenever the geometric functional is expressible as a sum of exponentially stabilizing score functions satisfying a moment condition. By incorporating stabilization methods into the Malliavin-Stein theory, we obtain rates of normal approximation for sums of stabilizing score functions which either improve upon existing rates or are the first of their kind. Our general rates hold for functionals of marked input on spaces more general than full-dimensional subsets of $\mathbb{R}^d$, including $m$-dimensional Riemannian manifolds, $m\leq d$. We use the general results to deduce improved and new rates of normal convergence for several functionals in stochastic geometry, including those whose variances re-scale as the volume or the surface area of an underlying set. In particular, we improve upon rates of normal convergence for the $k$-face and $i$th intrinsic volume functionals of the convex hull of Poisson and binomial random samples in a smooth convex body in dimension $d\geq 2$. We also provide improved rates of normal convergence for statistics of nearest neighbors graphs and high-dimensional data sets, the number of maximal points in a random sample, estimators of surface area and volume arising in set approximation via Voronoi tessellations, and clique counts in generalized random geometric graphs.
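To make concrete the kind of statistic the abstract refers to, here is a minimal Monte Carlo sketch (not from the paper) of one of the simplest stabilizing functionals it covers: the total nearest-neighbor distance of a uniform binomial sample in the unit square. The sketch standardizes the statistic across replications and reports the empirical Kolmogorov distance to the standard normal; all function names and parameter values are illustrative choices, not the paper's notation.

```python
import math
import random

def nn_stat(points):
    """Sum of nearest-neighbour distances (naive O(n^2) scan)."""
    total = 0.0
    for i, (xi, yi) in enumerate(points):
        best = float("inf")
        for j, (xj, yj) in enumerate(points):
            if i != j:
                d = math.hypot(xi - xj, yi - yj)
                if d < best:
                    best = d
        total += best
    return total

def phi(t):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

def kolmogorov_distance(n_points=100, reps=150, seed=7):
    """Empirical Kolmogorov distance between the standardized
    nearest-neighbour statistic and N(0, 1)."""
    rng = random.Random(seed)
    stats = []
    for _ in range(reps):
        pts = [(rng.random(), rng.random()) for _ in range(n_points)]
        stats.append(nn_stat(pts))
    mu = sum(stats) / reps
    sd = math.sqrt(sum((s - mu) ** 2 for s in stats) / (reps - 1))
    z = sorted((s - mu) / sd for s in stats)
    # Kolmogorov distance: largest gap between the empirical CDF
    # steps and the normal CDF at each order statistic.
    return max(
        max(abs((k + 1) / reps - phi(t)), abs(k / reps - phi(t)))
        for k, t in enumerate(z)
    )
```

For moderate `n_points` the distance is already small, consistent with (though of course far weaker than) the quantitative central limit theorems the paper proves.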

Highlights

  • Let (X, F) be a measurable space equipped with a σ-finite measure Q and a measurable semi-metric d : X × X → [0, ∞)

  • We are interested in quantitative central limit theorems for stabilizing functionals, whereas laws of large numbers are shown in [26, 29] and moderate deviations are considered in [13]

  • Statistics Hs and Hn typically describe a global property of a random geometric structure on X in terms of local contributions exhibiting spatial interaction and dependence


Summary

Introduction

Let (X, F) be a measurable space equipped with a σ-finite measure Q and a measurable semi-metric d : X × X → [0, ∞). In the setting X = Rd, we expect that all of the statistics Hs and Hn described in [7, 25, 28,29,30] consist of sums of scores ξs and ξn satisfying the conditions of Theorem 2.3, showing that the statistics in these papers enjoy rates of normal convergence (in the Kolmogorov distance) given by the reciprocal of the standard deviation of Hs and Hn, respectively. In the case that X is an m-dimensional C1-submanifold of Rd, with d the Euclidean distance in Rd, the directed nearest neighbors graph version of Theorem 3.1 [cf. Remark (iii) above] may be refined to give rates of normal convergence for statistics of high-dimensional nonlinear data sets. The breakthrough paper [34], which relies on dependency graph methods and Voronoi cells, establishes rates of normal convergence for Poisson input and h ∈ {f0, . . . , fd−1, Vd } of the order s−
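The rate "given by the reciprocal of the standard deviation" mentioned above can be written schematically as a Berry–Esseen-type bound; the constant C below is an assumption standing in for the paper's explicit dependence on the score functions:

$$\sup_{t \in \mathbb{R}} \left| \mathbb{P}\!\left( \frac{H_s - \mathbb{E} H_s}{\sqrt{\operatorname{Var} H_s}} \le t \right) - \Phi(t) \right| \le \frac{C}{\sqrt{\operatorname{Var} H_s}},$$

where $\Phi$ denotes the standard normal distribution function, and the analogous bound holds for the binomial statistic $H_n$.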
