Abstract

Shannon entropy has long been accepted as a primary basis for assessing the information content of sensor channels used for the remote sensing of atmospheric variables. It is not widely appreciated, however, that Shannon information content (SIC) can be misleading in retrieval problems involving nonlinear mappings between direct observations and retrieved variables and/or non-Gaussian prior and posterior PDFs. The potentially severe shortcomings of SIC are illustrated with simple experiments that reveal, for example, that a measurement can be judged to provide negative information even in cases in which the postretrieval PDF is undeniably improved over an informed prior based on climatology. Following previous authors writing mainly in the data assimilation and climate analysis literature, the Kullback–Leibler (KL) divergence, also commonly known as relative entropy, is shown to suffer from fewer obvious defects in this particular context. Yet even KL divergence is blind to the expected magnitude of errors as typically measured by the error variance or root-mean-square error. Thus, neither information metric can necessarily be counted on to respond in a predictable way to changes in the precision or quality of a retrieved quantity.
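To make the contrast concrete, the following is a minimal sketch, not taken from the article, of a one-dimensional Gaussian case in which SIC (defined as the entropy reduction from prior to posterior) comes out negative while the KL divergence remains positive. The prior and posterior parameters are illustrative assumptions: a measurement shifts the posterior mean decisively away from the climatological prior while slightly widening its spread.

```python
# Illustrative sketch (assumed example, not from the paper): SIC can be
# negative while KL divergence stays nonnegative for Gaussian PDFs.
import math

def gaussian_entropy(sigma):
    """Differential entropy (nats) of a 1-D Gaussian with std dev sigma."""
    return 0.5 * math.log(2.0 * math.pi * math.e * sigma**2)

def gaussian_kl(mu_q, sigma_q, mu_p, sigma_p):
    """KL divergence D(q || p) between 1-D Gaussians, in nats."""
    return (math.log(sigma_p / sigma_q)
            + (sigma_q**2 + (mu_q - mu_p)**2) / (2.0 * sigma_p**2)
            - 0.5)

# Assumed climatological prior N(0, 1) and posterior N(2.0, 1.2**2):
# the mean shifts substantially, but the spread grows slightly.
mu_prior, sigma_prior = 0.0, 1.0
mu_post,  sigma_post  = 2.0, 1.2

sic = gaussian_entropy(sigma_prior) - gaussian_entropy(sigma_post)
kl  = gaussian_kl(mu_post, sigma_post, mu_prior, sigma_prior)

print(f"SIC (entropy reduction): {sic:+.3f} nats")  # about -0.182 (negative)
print(f"KL divergence:           {kl:+.3f} nats")   # about +2.038 (positive)
```

Because SIC depends only on the change in entropy, the modest widening of the posterior yields negative "information" despite the clearly improved state estimate; the KL divergence, which also registers the shift in the mean, remains positive, consistent with the behavior described above.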
