Abstract

The present paper investigates the update of an empirical probability distribution with the results of a new set of observations. The update reproduces the new observations and interpolates using prior information. The optimal update is obtained by minimizing either the Hellinger distance or the quadratic Bregman divergence. The results obtained by the two methods differ. Updates with information about conditional probabilities are considered as well.
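
For reference, both divergences named in the abstract are standard objects; in the discrete case they are commonly written as follows (the paper's normalization conventions may differ):

$$ d^2(p,q) \;=\; \tfrac12 \sum_i \bigl(\sqrt{p_i} - \sqrt{q_i}\,\bigr)^2, \qquad D(p,q) \;=\; \sum_i (p_i - q_i)^2, $$

where $d^2$ is the squared Hellinger distance and $D$ is the Bregman divergence generated by the convex function $\Phi(q) = \sum_i q_i^2$.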

Highlights

  • It is an updating rule used in Radical Probabilism [7].

  • The present work is inspired by current practices in Information Geometry [1,2,3].

  • The Pythagorean relations derived in the present work make use of the specific properties of the Hellinger distance and of the quadratic Bregman divergence (a classical special case is sketched after this list).
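As a point of orientation only (this identity is standard and is not taken from the paper): for the quadratic Bregman divergence, a Pythagorean relation reduces to the classical Euclidean one. If $q$ is the orthogonal projection of the prior $r$ onto an affine constraint set $C$, then for every $x \in C$

$$ \sum_i (x_i - r_i)^2 \;=\; \sum_i (x_i - q_i)^2 \;+\; \sum_i (q_i - r_i)^2 . $$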



Introduction

It is an updating rule used in Radical Probabilism [7]. This expression is obtained when minimizing the Hellinger distance between the prior and the model manifold. The Pythagorean relations derived in the present work make use of the specific properties of the Hellinger distance and of the quadratic Bregman divergence. They show that this statement remains true when the metric distance is replaced by a Bregman divergence. It is shown in Theorem 2 below that a proof in a more general context yields a deviating result. The proof of the theorems can be adapted to cover the situation in which a subsequent measurement yields information on conditional probabilities.
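A minimal sketch may make the difference between the two minimizations concrete. It is not the paper's construction: the partition blocks E_j and the observed block probabilities lam_j below are illustrative notation introduced here, and both constrained minimizations follow from elementary Lagrange arguments. Under these assumptions the Hellinger-optimal update rescales the prior inside each block (Jeffrey's rule, the updating rule of Radical Probabilism), while the quadratic-Bregman-optimal update shifts every outcome of a block by the same amount:

```python
import numpy as np

# Toy setting (assumed, not from the paper): prior p over 4 outcomes;
# new observations fix the probabilities of the partition blocks
# E1 = {0, 1} and E2 = {2, 3}.
p = np.array([0.1, 0.3, 0.2, 0.4])             # prior distribution
blocks = [np.array([0, 1]), np.array([2, 3])]  # partition of the outcomes
lam = [0.7, 0.3]                               # observed block probabilities

# Hellinger update: minimizing sum_i (sqrt(q_i) - sqrt(p_i))^2 subject to
# q(E_j) = lam_j rescales the prior within each block (Jeffrey's rule).
q_hel = p.copy()
for E, l in zip(blocks, lam):
    q_hel[E] = l * p[E] / p[E].sum()

# Quadratic Bregman update: minimizing sum_i (q_i - p_i)^2 under the same
# constraints is a Euclidean projection, which distributes the deficit
# lam_j - p(E_j) equally over the outcomes of the block.
q_breg = p.copy()
for E, l in zip(blocks, lam):
    q_breg[E] = p[E] + (l - p[E].sum()) / len(E)

print("prior    :", p)        # [0.1   0.3   0.2  0.4 ]
print("Hellinger:", q_hel)    # [0.175 0.525 0.1  0.2 ]
print("Bregman  :", q_breg)   # [0.25  0.45  0.05 0.25]
```

Both updates reproduce the observed block probabilities exactly and leave the prior unchanged in the trivial case lam_j = p(E_j); otherwise they differ, the Hellinger update being multiplicative (hence positivity-preserving) and the Bregman update additive (so a non-negativity constraint may become active in general).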

Outline

  • Empirical Data
  • Squared Hellinger Distance
  • Bregman Divergence
  • Updated Probabilities
  • Update of Conditional Probabilities
  • The Hellinger Case
  • The Bregman Case
  • Example
  • Summary
