Abstract

Most methods currently employed in forecasting problems are based on scoring rules. Each scoring rule has an associated divergence function, which can be used as a measure of discrepancy between probability distributions. This approach is commonly used in the literature to compare two competing predictive distributions on the basis of their relative expected divergence from the true distribution. In this paper we focus on the use of scoring rules as a tool for finding predictive distributions for an unknown quantity of interest. The proposed predictive distributions are asymptotic modifications of the estimative solutions, obtained by minimizing the expected divergence related to a general scoring rule. The asymptotic properties of such predictive distributions are closely related to the geometry induced by the considered divergence on a regular parametric model. In particular, the existence of a globally optimal predictive distribution is guaranteed for invariant divergences, whose local behaviour is similar to that of the well-known $\alpha$-divergences. We show that a wide class of divergences obtained from weighted scoring rules shares invariance properties with $\alpha$-divergences. For weighted scoring rules it is thus possible to obtain a global solution to the prediction problem. Unfortunately, the divergences associated with many widely used scoring rules are not invariant. Still, for these cases we provide a locally optimal predictive distribution within a specified parametric model.
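To make the link between scoring rules and divergences concrete, the following sketch illustrates the standard construction for a proper scoring rule: the divergence is the excess expected score incurred by quoting $q$ when the data come from $p$. With the logarithmic score this divergence is exactly the Kullback-Leibler divergence (a well-known fact, not specific to this paper; the discrete distributions used here are illustrative).

```python
import math

# For a (negatively oriented) proper scoring rule S(q, x), the associated
# divergence is d(p, q) = E_p[S(q, X)] - E_p[S(p, X)] >= 0.
# With the logarithmic score S(q, x) = -log q(x), d(p, q) = KL(p || q).

def log_score(q, x):
    """Logarithmic scoring rule: -log q(x)."""
    return -math.log(q[x])

def expected_score(p, q):
    """E_p[S(q, X)] for a discrete distribution p on {0, ..., k-1}."""
    return sum(p[x] * log_score(q, x) for x in range(len(p)))

def score_divergence(p, q):
    """Divergence associated with the log score: excess expected score."""
    return expected_score(p, q) - expected_score(p, p)

def kl(p, q):
    """Kullback-Leibler divergence between discrete distributions."""
    return sum(px * math.log(px / qx) for px, qx in zip(p, q))

p = [0.2, 0.5, 0.3]  # illustrative "true" distribution
q = [0.3, 0.4, 0.3]  # illustrative predictive distribution
print(score_divergence(p, q))  # coincides with KL(p || q)
print(kl(p, q))
```

Other proper scoring rules (e.g. the quadratic/Brier score) yield different divergences by the same recipe; propriety is what guarantees the divergence is non-negative and vanishes only at $q = p$.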

Highlights

  • Recent years have seen growing interest in the use of scoring rules for statistical estimation and prediction

  • We show that a wide class of divergences obtained from weighted scoring rules share invariance properties with α-divergences

  • In this work we propose a wider and more complete use of scoring rules for prediction, one that goes beyond the simple comparison of two competing predictive distributions



Introduction

Recent years have seen growing interest in the use of scoring rules for statistical estimation and prediction. The class of monotone and regular divergences, introduced in [8] and studied in [9] as a wide class of invariant divergences, leads to asymptotically optimal predictive distributions. Borrowing from [11], we consider the existence of predictive distributions that asymptotically minimize score and weighted score divergences. To this end, we first study the local behaviour of score and weighted score divergences up to the third order. For quasi-Bregman weighted divergences, the optimal predictive distribution does not depend on the considered parametric model, and thus constitutes a complete and global solution to the prediction problem.
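The idea of improving on the estimative (plug-in) solution can be sketched in the simplest setting treated later in the outline, the normal model with unknown mean. The computation below is the classical closed-form risk comparison (an assumed illustrative setup, not the paper's notation): with known variance 1 and sample mean $\bar{x} \sim N(\mu, 1/n)$, the expected KL divergence from the truth to a predictive $N(\bar{x}, c)$ can be written in closed form, and inflating the variance to $c = 1 + 1/n$ beats the plug-in choice $c = 1$.

```python
import math

# Illustrative sketch: data from N(mu, 1), mean unknown. The estimative
# predictive is N(xbar, 1); a classical asymptotic modification inflates
# the variance to N(xbar, 1 + 1/n). Since xbar ~ N(mu, 1/n), the expected
# KL divergence E[KL(N(mu, 1) || N(xbar, c))] has a closed form in c.

def expected_kl(c, n):
    """E[KL(N(mu,1) || N(xbar, c))], averaging over xbar ~ N(mu, 1/n)."""
    return 0.5 * math.log(c) + (1.0 + 1.0 / n) / (2.0 * c) - 0.5

n = 20
estimative = expected_kl(1.0, n)          # plug-in predictive: 1/(2n)
modified = expected_kl(1.0 + 1.0 / n, n)  # variance-inflated predictive
print(estimative, modified)  # the modified predictive has smaller risk
```

Minimizing the closed form over $c$ gives exactly $c = 1 + 1/n$, so the improvement over the estimative solution is of order $1/n^2$, matching the "asymptotic modification" viewpoint described in the abstract.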

Basic concepts in differential geometry of statistical models
The geometry of divergences
Monotone and regular divergences
Examples
Score divergences
The geometry of score divergences
Weighted score divergences
The geometry of quasi-Bregman weighted score divergences
The prediction problem
A locally optimal predictive distribution
A global solution to the problem of prediction
The normal model with unknown mean
The autoregressive model
The normal non-linear model
The normal model with unknown mean and variance
The exponential model
