Abstract

Recently, a series of divergence measures have emerged from information theory and statistics, and numerous inequalities have been established among them. However, none of them is a metric in topology. In this paper, we propose a class of metric divergence measures, namely Lp(P ‖ Q), p ≥ 1, and study their mathematical properties. We then study an important divergence measure widely used in credit scoring, called information value. In particular, we explore the mathematical reasoning behind weight of evidence and suggest a better alternative to weight of evidence. Finally, we propose using Lp(P ‖ Q), p ≥ 1, as alternatives to information value to overcome its disadvantages.
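As a concrete illustration of the credit-scoring quantities mentioned in the abstract, the sketch below computes weight of evidence (WoE) and information value (IV) from binned good/bad counts using their standard definitions, WoE_i = ln(g_i / b_i) and IV = Σ (g_i − b_i)·WoE_i, where g_i and b_i are the proportions of goods and bads in bin i. The bin counts and variable names are hypothetical; this is not code from the paper.

```python
import math

# Hypothetical binned counts of "good" and "bad" accounts.
good_counts = [200, 350, 300, 150]
bad_counts  = [ 80,  60,  40,  20]

def woe_iv(goods, bads):
    """Per-bin weight of evidence and total information value.

    Standard definitions:
        g_i = goods_i / sum(goods),  b_i = bads_i / sum(bads)
        WoE_i = ln(g_i / b_i)
        IV    = sum_i (g_i - b_i) * WoE_i
    Assumes every bin has nonzero good and bad counts.
    """
    g_total, b_total = sum(goods), sum(bads)
    woe, iv = [], 0.0
    for g, b in zip(goods, bads):
        g_i, b_i = g / g_total, b / b_total
        w = math.log(g_i / b_i)
        woe.append(w)
        iv += (g_i - b_i) * w
    return woe, iv

woe, iv = woe_iv(good_counts, bad_counts)
print("WoE per bin:", [round(w, 4) for w in woe])
print("Information value:", round(iv, 4))
```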

Highlights

  • We propose a class of metric divergence measures, namely, Lp(P ‖ Q), p ≥ 1, and study their mathematical properties (a short numerical sketch follows this list)

  • We study an important divergence measure widely used in credit scoring, called information value

  • The information measure is an important concept in information theory and statistics
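
The sketch below illustrates the highlighted class Lp(P ‖ Q), p ≥ 1, read as the p-norm of the difference of the two discrete distributions, Lp(P ‖ Q) = (Σ_i |p_i − q_i|^p)^(1/p); this reading is assumed from the notation rather than quoted from the paper, and the distributions are illustrative only. It also checks the triangle inequality, the metric property emphasized above, for one numerical example.

```python
# Assumed reading of L_p(P || Q), p >= 1, as the p-norm of P - Q.
def lp_divergence(p_dist, q_dist, p=2):
    """(sum_i |p_i - q_i|^p)^(1/p) for two discrete distributions of equal length."""
    return sum(abs(a - b) ** p for a, b in zip(p_dist, q_dist)) ** (1.0 / p)

P = [0.5, 0.3, 0.2]
Q = [0.4, 0.4, 0.2]
R = [0.2, 0.5, 0.3]

# A metric must satisfy the triangle inequality; check it numerically for this example.
d_PQ = lp_divergence(P, Q)
d_QR = lp_divergence(Q, R)
d_PR = lp_divergence(P, R)
print(d_PR <= d_PQ + d_QR)  # True for these inputs
```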


Summary

Introduction

The information measure is an important concept in information theory and statistics. For all P, Q ∈ Δn, where Δn denotes the set of n-point discrete probability distributions, the following divergence measures are well known in the literature of information theory and statistics. Cressie and Read [13] considered the one-parametric generalization of the information measure D(P ‖ Q), called the relative information of type s, given by

Ds(P ‖ Q) = [s(s − 1)]^(−1) [ Σ_{i=1}^{n} p_i^s q_i^(1−s) − 1 ],  s ≠ 0, 1,

with the cases s = 0 and s = 1 defined by the corresponding limits, which recover the Kullback–Leibler divergences D(Q ‖ P) and D(P ‖ Q), respectively. Taneja [14] proved that all three s-type information measures Ds(P ‖ Q), Vs(P ‖ Q), and Ws(P ‖ Q) are nonnegative and convex in the pair (P, Q). He also obtained inequalities relating these divergence measures.
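
To make the type-s measure concrete, the sketch below evaluates Ds(P ‖ Q) directly from the formula above for a small example and compares it with the Kullback–Leibler divergence as s approaches 1; the distributions and function names are illustrative only.

```python
import math

def relative_information_type_s(P, Q, s):
    """D_s(P || Q) = [s(s-1)]^(-1) * (sum_i p_i^s * q_i^(1-s) - 1), s != 0, 1."""
    if s in (0, 1):
        raise ValueError("s = 0 and s = 1 are defined by limits (KL divergences)")
    total = sum(p ** s * q ** (1 - s) for p, q in zip(P, Q))
    return (total - 1) / (s * (s - 1))

def kl_divergence(P, Q):
    """D(P || Q) = sum_i p_i * ln(p_i / q_i)."""
    return sum(p * math.log(p / q) for p, q in zip(P, Q))

P = [0.5, 0.3, 0.2]
Q = [0.4, 0.4, 0.2]

# As s -> 1, D_s(P || Q) approaches the Kullback-Leibler divergence D(P || Q).
for s in (0.5, 0.9, 0.99, 0.999):
    print(s, relative_information_type_s(P, Q, s))
print("KL:", kl_divergence(P, Q))
```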

Review of Metric Space
Nonmetric Divergence Measures
Information Value in Credit Scoring
Numerical Results
Conclusions