Abstract
The maximum entropy (MaxEnt) method offers a powerful means of estimating the probability density functions (pdf's) of measured data. Traditionally, pdf's have been estimated by fitting data to particular distributions (e.g., Gaussian, log-normal, Rayleigh, chi-squared, etc.). This approach can lead to misleading results, especially if the data are highly skewed or kurtotic. First published in 1957 by Jaynes [E. T. Jaynes, Phys. Rev., 106, 620–630 (1957)], MaxEnt has gained wide use in recent years. The method uses Lagrangian multipliers to maximize the entropy, a measure of uncertainty, subject to specified constraints. The result is a pdf that Jaynes describes as being the "least biased estimate possible on the given information; i.e., it is maximally noncommittal with regard to missing information." In other words, all possibilities under the given constraints are considered equally. This method is particularly useful in determining prior pdf's based on limited measurements. MaxEnt pdf's, which belong in the exponential class of distributions, can also be used to derive optimum processors in the maximum likelihood sense. This talk will discuss the theory and implementation of the MaxEnt method and emphasize its utility for modeling acoustic propagation in uncertain environments. [Work supported by The Undersea Signal Processing Program, Office of Naval Research.]
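To make the procedure described in the abstract concrete, the sketch below (not part of the talk itself) estimates a MaxEnt pdf numerically for an assumed pair of constraints, a specified mean and variance, by solving for the Lagrange multipliers of the exponential-form density. The moment values, integration grid, and solver are illustrative assumptions. For this particular constraint set the MaxEnt solution is the Gaussian, which the script uses as a check.

```python
# Minimal numerical sketch of maximum-entropy pdf estimation
# (illustrative only; the constraints, grid, and solver are assumptions,
# not taken from the abstract).
#
# Constraints: normalization, a specified mean, and a specified variance.
# The MaxEnt solution then has the exponential-class form
#     p(x) = exp(-lam0 - lam1*x - lam2*x**2),
# which for these constraints reduces to a Gaussian.

import numpy as np
from scipy.optimize import fsolve
from scipy.integrate import trapezoid

# Target moments (assumed example values, e.g. estimated from measured data)
mean_target, var_target = 1.0, 0.5

# Discretized support for numerical integration
x = np.linspace(-10.0, 12.0, 4001)

def moment_residuals(lams):
    """Constraint residuals for the exponential-form trial pdf."""
    lam1, lam2 = lams
    # Unnormalized exponential-class density; exp(-lam0) is absorbed into Z
    w = np.exp(-lam1 * x - lam2 * x**2)
    Z = trapezoid(w, x)
    p = w / Z
    m1 = trapezoid(x * p, x)
    m2 = trapezoid(x**2 * p, x)
    return [m1 - mean_target, (m2 - m1**2) - var_target]

# Solve for the Lagrange multipliers that enforce the moment constraints
lam1, lam2 = fsolve(moment_residuals, x0=[0.0, 1.0])

w = np.exp(-lam1 * x - lam2 * x**2)
p = w / trapezoid(w, x)

# Sanity check: for mean/variance constraints the MaxEnt pdf is Gaussian
gauss = np.exp(-(x - mean_target)**2 / (2 * var_target)) / np.sqrt(2 * np.pi * var_target)
print("max |p - gaussian| =", np.max(np.abs(p - gauss)))  # should be near zero
```

With other constraints (for example, bounded support or higher-order moments) the same multiplier-solving procedure applies, but the resulting exponential-class pdf generally has no familiar closed form.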