Abstract

We consider a random interval splitting process, in which the splitting rule depends on the empirical distribution of interval lengths. We show that this empirical distribution converges to a limit almost surely as the number of intervals goes to infinity. We give a characterization of this limit as a solution of an ODE and use this to derive precise tail estimates. The convergence is established by showing that the size-biased empirical distribution evolves in the limit according to a certain deterministic evolution equation. Although this equation involves a non-local, non-linear operator, it can be studied thanks to a carefully chosen norm with respect to which this operator is contractive. In finite-dimensional settings, convergence results like this usually go under the name of stochastic approximation and can be approached by a general method of Kushner and Clark. An important technical contribution of this article is the extension of this method to an infinite-dimensional setting.
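The abstract does not specify the splitting rule, but the flavor of the process can be illustrated with a minimal simulation sketch. The rule used here — pick an interval size-biased (with probability proportional to its length) and split it at a uniformly random point — is an illustrative assumption, not the paper's actual rule, and the function name `simulate_splitting` is hypothetical.

```python
import random

def simulate_splitting(n_intervals, seed=0):
    """Simulate a simple interval-splitting process on [0, 1].

    Illustrative assumption (NOT the paper's rule): at each step an
    interval is chosen size-biased, i.e. with probability proportional
    to its length, and split at a uniformly random point.
    """
    rng = random.Random(seed)
    lengths = [1.0]  # start with the single interval [0, 1]
    while len(lengths) < n_intervals:
        # size-biased pick: probability of an interval ~ its length
        i = rng.choices(range(len(lengths)), weights=lengths)[0]
        u = rng.random()          # uniform split point within the interval
        l = lengths.pop(i)
        lengths.extend([u * l, (1 - u) * l])
    return lengths

lengths = simulate_splitting(1000)
# The empirical distribution of (normalized) lengths is the object whose
# almost-sure convergence the paper studies.
print(len(lengths), sum(lengths))
```

The size-biased selection step mirrors the role played by the size-biased empirical distribution in the paper's analysis: sampling an interval proportionally to its length is exactly sampling from that distribution.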
