Abstract

We compute exact values and, respectively, bounds of dissimilarity/distinguishability measures (in the sense of the Kullback-Leibler information distance (relative entropy) and some transforms of more general power divergences and Rényi divergences) between two competing discrete-time Galton-Watson branching processes with immigration (GWI) for which both the offspring and the immigration (importation) are arbitrarily Poisson-distributed; in particular, we allow for arbitrary types of extinction-concerning criticality and thus for non-stationarity. We apply this to optimal decision making in the context of the spread of potentially pandemic infectious diseases (such as, e.g., the current COVID-19 pandemic), for instance covering different levels of dangerousness and different kinds of intervention/mitigation strategies. Asymptotic distinguishability behaviour and diffusion limits are investigated, too.
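As a minimal illustration of why the Poisson assumption yields tractability, the Kullback-Leibler divergence, the Hellinger integral, and the Rényi divergence between two single Poisson laws all admit well-known closed forms. The sketch below implements these single-distribution formulas only; it is not the paper's process-level machinery for GWI processes, and the function names are our own.

```python
import math

def kl_poisson(a: float, b: float) -> float:
    """Closed-form Kullback-Leibler divergence KL(Poisson(a) || Poisson(b))."""
    return a * math.log(a / b) - a + b

def hellinger_integral_poisson(lam: float, a: float, b: float) -> float:
    """Closed-form Hellinger integral H_lam(Poisson(a) || Poisson(b)),
    for order lam not in {0, 1}:  exp(a^lam * b^(1-lam) - lam*a - (1-lam)*b)."""
    return math.exp(a**lam * b**(1.0 - lam) - lam * a - (1.0 - lam) * b)

def renyi_divergence_poisson(lam: float, a: float, b: float) -> float:
    """Renyi divergence of order lam, obtained as log(H_lam) / (lam - 1)."""
    return math.log(hellinger_integral_poisson(lam, a, b)) / (lam - 1.0)
```

For identical parameters the Hellinger integral equals 1 and both divergences vanish, as expected for coinciding distributions.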

Highlights

  • Because of the involved Poisson distributions, these goals can be tackled with a high degree of tractability, which is worked out in detail with the following structure: in Section 2, we first introduce (i) the basic ingredients of Galton-Watson processes together with their interpretations in the above-mentioned pandemic setup, where it is essential to study all types of criticality, (ii) the employed fundamental information measures such as Hellinger integrals, power divergences and Rényi divergences, (iii) the underlying decision-making framework, as well as (iv) connections to time series of counts and asymptotic distinguishability

  • In terms of our notations (PS1) to (PS3), a typical situation for the applications we have in mind is that one particular constellation ∈ P is fixed, whereas, in contrast, the parameter λ ∈ R\{0, 1} for the Hellinger integral or the power divergence may be chosen freely, e.g., depending on which dissimilarity measure one decides to choose for further analysis

  • Λ which, depending on the parameter constellation ∈ (PSP\PSP,1)×]0, 1[, may or may not lead to upper bounds BλU,X0,n which are consistent with Goal (G1) or with (G2)


Summary

Introduction

(This paper is a thoroughly revised, extended and retitled version of the preprint arXiv:1005.3758v1 of both authors.) Over the past twenty years, density-based divergences D(P, Q) (known as (dis)similarity measures, directed distances, disparities, distinguishability measures, or proximity measures) between probability distributions P and Q have turned out to be of substantial importance for decisive statistical tasks such as parameter estimation, testing for goodness-of-fit, Bayesian decision procedures, change-point detection, clustering, as well as for other research fields such as information theory, artificial intelligence, machine learning, signal processing (including image and speech processing), pattern recognition, econometrics, and statistical physics. Because of the involved Poisson distributions, these goals can be tackled with a high degree of tractability, which is worked out in detail with the following structure (see the full table of contents after this paragraph): in Section 2, we first introduce (i) the basic ingredients of Galton-Watson processes together with their interpretations in the above-mentioned pandemic setup, where it is essential to study all types of criticality (being connected with levels of reproduction numbers), (ii) the employed fundamental information measures such as Hellinger integrals, power divergences and Rényi divergences, (iii) the underlying decision-making framework, as well as (iv) connections to time series of counts and asymptotic distinguishability. Explicit closed-form bounds of Hellinger integrals Hλ(PA||PH) will be worked out in Section 6, whereas Section 7 deals with Hellinger integrals and power divergences of the above-mentioned Galton-Watson-type diffusion approximations
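To make the process setup concrete, a Galton-Watson process with immigration and Poisson offspring/immigration evolves as X_{n+1} = (sum of X_n i.i.d. Poisson(β) offspring counts) + Poisson(α) immigrants, and the offspring sum is itself Poisson(β·X_n). The following is a minimal simulation sketch under these standard dynamics; the function names and the choice of Knuth's Poisson sampler are ours, not the paper's.

```python
import math
import random

def simulate_gwi(beta: float, alpha: float, x0: int, n: int,
                 rng: random.Random) -> list:
    """Simulate n generations of a Galton-Watson process with immigration:
    Poisson(beta) offspring per individual, Poisson(alpha) immigrants per
    generation. Uses the fact that the sum of x i.i.d. Poisson(beta)
    variables is Poisson(beta * x)."""
    def poisson(mean: float) -> int:
        # Knuth's multiplication algorithm; adequate for small means in a sketch.
        if mean <= 0.0:
            return 0
        limit, k, p = math.exp(-mean), 0, 1.0
        while True:
            p *= rng.random()
            if p <= limit:
                return k
            k += 1

    path = [x0]
    for _ in range(n):
        x = path[-1]
        path.append(poisson(beta * x) + poisson(alpha))
    return path
```

The offspring mean β plays the role of the reproduction number: β < 1, β = 1 and β > 1 correspond to the subcritical, critical and supercritical regimes discussed in the paper, while α > 0 (immigration/importation) keeps the process from being absorbed at zero.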

Process Setup
Connections to Time Series of Counts
Applicability to Epidemiology
Information Measures
Decision Making under Uncertainty
Asymptotical Distinguishability
A First Basic Result
Some Useful Facts for Deeper Analyses
3.14. Intermezzo 1
3.15.1. Bayesian Decision Making
3.15.2. Neyman-Pearson Testing
Principal Approach
Totally Explicit Closed-Form Bounds
Applications to Decision Making
Branching-Type Diffusion Approximations
Bounds of Hellinger Integrals for Diffusion Approximations
Bounds of Power Divergences for Diffusion Approximations