Abstract

The information loss in deterministic, memoryless systems is investigated by evaluating the conditional entropy of the input random variable given the output random variable. It is shown that for a large class of systems the information loss is finite, even if the input has a continuous distribution. For systems with infinite information loss, a relative measure is defined and shown to be related to Rényi information dimension. Since deterministic signal processing can only destroy information, it is important to know how this loss affects the solution of inverse problems. Hence, we connect the probability of perfectly reconstructing the input to the information lost in the system via Fano-type bounds. The theoretical results are illustrated by example systems commonly used in discrete-time, nonlinear signal processing and communications.
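
For a discrete input X and a deterministic map g, the information loss H(X | Y) with Y = g(X) can be evaluated directly from the input distribution. The following is a minimal sketch, not code from the paper: it uses a hypothetical four-valued input and the square-law device y = x² (the paper treats this device with a Gaussian input) to show that exactly the sign information, one bit, is destroyed.

```python
import numpy as np
from collections import defaultdict

# Hypothetical example (not from the paper): X is uniform on {-2, -1, 1, 2} and is
# passed through the square-law device y = g(x) = x**2.  The information loss of the
# system is the conditional entropy H(X | Y) of the input given the output.

def information_loss(p_x, g):
    """H(X | Y) in bits for a discrete input distribution p_x and a deterministic map g."""
    groups = defaultdict(list)          # group input probabilities by their image under g
    for x, p in p_x.items():
        groups[g(x)].append(p)
    loss = 0.0
    for probs in groups.values():
        p_y = sum(probs)                # P(Y = y) is the total mass of the preimage of y
        for p in probs:                 # P(X = x, Y = y) = P(X = x) for every x in that preimage
            loss -= p * np.log2(p / p_y)
    return loss

p_x = {-2: 0.25, -1: 0.25, 1: 0.25, 2: 0.25}
print(information_loss(p_x, lambda x: x * x))   # 1.0 bit: exactly the sign of X is lost
```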

Highlights

  • When opening a textbook on linear [1] or nonlinear [2] deterministic signal processing, input-output systems are typically characterized (aside from the difference or differential equation defining the system) by energy- or power-related concepts: L2 gain or energy/power gain, passivity, losslessness, input-output stability, and transfer functions are all defined via the amplitudes of the involved signals and are thus essentially energetic in nature.

  • What these books are currently lacking is an information-theoretic characterization of signal processing systems, despite the fact that such a characterization is strongly suggested by the data processing inequality [5] (Corollary 7.16): we know that the information content of a signal cannot increase through deterministic processing, just as, loosely speaking, a passive system cannot increase the energy contained in a signal (see the numeric sketch after this list).

  • While we have a definition of the energy loss, an analysis of the information loss in a system has not been presented yet. This gap is even more surprising since important connections between information-theoretic quantities and signal processing performance measures have long been known: the mutual information between a random variable and its noisy observation is connected to the minimum mean squared reconstruction error (MSRE) [6], and mutual information yields a tight bound on the gain of nonlinear prediction [7].
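
As referenced in the second bullet above, the data processing statement can be checked numerically for discrete inputs: for any deterministic map g, the output entropy H(g(X)) never exceeds the input entropy H(X), and the difference is exactly the information loss H(X | g(X)). The snippet below is only an illustrative check with randomly generated distributions and maps, not material from the paper.

```python
import numpy as np

# Illustrative numeric check (not from the paper): for a discrete input X and any
# deterministic map g, the output entropy H(g(X)) can never exceed the input entropy
# H(X); the difference H(X) - H(g(X)) equals the information loss H(X | g(X)).

def entropy(p):
    p = np.asarray([q for q in p if q > 0.0])
    return float(-np.sum(p * np.log2(p)))

rng = np.random.default_rng(0)
for _ in range(5):
    p_x = rng.dirichlet(np.ones(8))                  # random input distribution on 8 symbols
    g = rng.integers(0, 4, size=8)                   # random deterministic map into 4 symbols
    p_y = np.bincount(g, weights=p_x, minlength=4)   # push-forward distribution of Y = g(X)
    assert entropy(p_y) <= entropy(p_x) + 1e-12
    print(f"H(X) = {entropy(p_x):.3f} bits  >=  H(g(X)) = {entropy(p_y):.3f} bits")
```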


Summary

Introduction

When opening a textbook on linear [1] or nonlinear [2] deterministic signal processing, input-output systems are typically characterized (aside from the difference or differential equation defining the system) by energy- or power-related concepts: L2 gain or energy/power gain, passivity, losslessness, input-output stability, and transfer functions are all defined via the amplitudes (or amplitude functions such as the squared amplitude) of the involved signals and are thus essentially energetic in nature. What these books lack is an information-theoretic characterization of such systems, even though the data processing inequality [5] (Corollary 7.16) tells us that the information content of a signal cannot increase through deterministic processing, just as, loosely speaking, a passive system cannot increase the energy contained in a signal. This gap is even more surprising since important connections between information-theoretic quantities and signal processing performance measures have long been known: the mutual information between a random variable and its noisy observation is connected to the minimum mean squared reconstruction error (MSRE) [6], and mutual information yields a tight bound on the gain of nonlinear prediction [7]. It is the purpose of this work to close this gap and to propose information loss as a general system characteristic, complementing the prevailing energy-centered descriptions. The Fano-type inequalities relating the relative and absolute measures of information loss to the probability of a reconstruction error are altogether new.
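
The paper's Fano-type bounds are tailored to the information loss of the systems studied there; as a rough, hedged illustration of the idea, the classical discrete Fano inequality already shows how a given conditional entropy H(X | Y) forces a minimum probability of a reconstruction error. The sketch below uses that textbook inequality, not the bounds derived in the paper, and the numbers refer to the hypothetical square-law example above.

```python
import numpy as np

# Hedged illustration: the classical discrete Fano inequality (not the paper's bounds).
# For an input X with M possible values and any reconstructor x_hat = r(y),
#     H(X | Y) <= h_b(P_e) + P_e * log2(M - 1),
# so a given information loss H(X | Y) forces a minimum reconstruction error probability.

def binary_entropy(p):
    if p <= 0.0 or p >= 1.0:
        return 0.0
    return float(-p * np.log2(p) - (1 - p) * np.log2(1 - p))

def fano_error_lower_bound(cond_entropy, M, steps=100000):
    """Smallest P_e consistent with Fano's inequality, found by a simple grid search."""
    for p_e in np.linspace(0.0, 1.0, steps):
        if binary_entropy(p_e) + p_e * np.log2(M - 1) >= cond_entropy:
            return float(p_e)
    return 1.0

# The square-law example above loses H(X | Y) = 1 bit on a 4-valued input, so any
# reconstruction of X from Y errs with probability at least about 0.19 (the true
# minimum for that example is 0.5, consistent with the bound).
print(fano_error_lower_bound(1.0, M=4))
```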

Related Work
Notation and Preliminaries
Information Loss
Relative Information Loss
Interplay between Information Loss and Relative Information Loss
Information Loss for Piecewise Bijective Functions
Information Loss in PBFs
Upper Bounds on the Information Loss
Reconstruction and Reconstruction Error Probability
Information Loss for Systems that Reduce Dimensionality
Relative Information Loss for Continuous Input RVs
Some Examples from Signal Processing and Communications
Quantizer
Center Clipper
Adding Two RVs
Square-Law Device and Gaussian Input
Polynomials
Energy Detection of Communication Signals
Principal Components Analysis and Dimensionality Reduction
Findings
Discussion and Outlook