Abstract

Limitations of statistics currently used to assess balance in observational samples include their insensitivity to shape discrepancies and their dependence upon sample size. The Jensen–Shannon divergence (JSD) is an alternative approach to quantifying the lack of balance among treatment groups that does not have these limitations. The JSD is an information-theoretic statistic derived from relative entropy, with three specific advantages relative to using standardized difference scores. First, it is applicable to cases in which the covariate is categorical or continuous. Second, it generalizes to studies in which there are more than two exposure or treatment groups. Third, it is decomposable, allowing for the identification of specific covariate values, treatment groups, or combinations thereof that are responsible for any observed imbalance.

Highlights

  • The goal of comparative studies is to measure the effect of two or more treatment groups on an outcome

  • Randomized clinical trials mitigate this risk through randomization of treatments, resulting in balanced groups with respect to the confounding variables

  • We propose the use of an information-theoretic measure known as the Jensen–Shannon divergence (JSD) [3] to assess treatment group balance


Introduction

The goal of comparative studies is to measure the effect of two or more treatment (or exposure) groups on an outcome. A potential source of bias in these studies is the association between the treatment groups and one or more confounding variables. We say that the relationship between treatment T and outcome O is confounded by a covariate C if C is associated with O and T but is not a consequence of T (i.e., not a mediator of the effect of T on O) [1]. The JSD offers several advantages over the aforementioned approaches. It is universally defined for binary, multilevel, and continuous distributions (in practice, computation for continuous distributions is facilitated by binning the variables into a number of discrete levels) and for any number of treatment groups.
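The generalized JSD described above can be sketched as follows: given the (binned) covariate distribution within each treatment group, it is the entropy of the weighted mixture minus the weighted mean of the per-group entropies. This is a minimal illustration, not the paper's implementation; the function name and the default equal group weights are assumptions made here for the example.

```python
import numpy as np

def generalized_jsd(distributions, weights=None):
    """Generalized Jensen-Shannon divergence among k discrete distributions.

    JSD = H(sum_i w_i P_i) - sum_i w_i H(P_i), with entropy H in bits.
    `distributions` is a k x m array: one row per treatment group, one
    column per covariate level (rows need not be pre-normalized).
    """
    P = np.asarray(distributions, dtype=float)
    P = P / P.sum(axis=1, keepdims=True)  # normalize each group's distribution
    k = P.shape[0]
    w = np.full(k, 1.0 / k) if weights is None else np.asarray(weights, float)

    def entropy(p):
        p = p[p > 0]  # convention: 0 * log 0 = 0
        return -np.sum(p * np.log2(p))

    mixture = w @ P  # pooled covariate distribution across groups
    return entropy(mixture) - sum(wi * entropy(pi) for wi, pi in zip(w, P))

# Identical group distributions give JSD = 0 (perfect balance);
# disjoint distributions across two equally weighted groups give 1 bit.
balanced = generalized_jsd([[0.5, 0.5], [0.5, 0.5]])   # 0.0
disjoint = generalized_jsd([[1.0, 0.0], [0.0, 1.0]])   # 1.0
```

For two groups with equal weights this reduces to the familiar pairwise JSD, which is bounded between 0 and 1 bit; the weights allow groups of unequal size to contribute proportionally to the pooled mixture.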

Information Theory and the JSD
Entropy
Joint and Conditional Entropy
Mutual Information
Relative Entropy
The JSD of Covariate Distributions Across Treatment Groups
Properties of the JSD
Applications
Findings
Summary
