Abstract

The minimum distance approach for reconstructing a positive function based on knowledge of finitely many linear functional values is examined. Two important classes of directed distances for signal processing and statistical inference are discussed. By imposing conditions analogous to those satisfied by linear projections in Hilbert space, two logarithmic entropy principles are derived. One of these involves the Itakura–Saito distortion measure of communication theory and uniquely extends Burg's maximum entropy method to incorporate prior knowledge. The other uses the Kullback–Leibler distance of statistics.
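As a point of reference only (the abstract fixes no notation, so the symbols below are introduced here for illustration), the two directed distances named above are commonly written, for positive functions $f$ and $g$ on a common domain, as

\[
D_{\mathrm{KL}}(f\,\|\,g) \;=\; \int \Bigl( f \log\frac{f}{g} \;-\; f \;+\; g \Bigr)\,dx,
\qquad
D_{\mathrm{IS}}(f\,\|\,g) \;=\; \int \Bigl( \frac{f}{g} \;-\; \log\frac{f}{g} \;-\; 1 \Bigr)\,dx,
\]

and the minimum distance reconstruction problem sketched in the abstract is of the generic form: minimize such a distance between the estimate $f$ and a prior estimate $p$ subject to the finitely many known linear functional values, e.g. $\int f(x)\, g_i(x)\, dx = c_i$ for $i = 1, \dots, n$.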
