Abstract

Imaging is not itself a system goal, but rather a means to support inference tasks. For data processing with linearized signal models, we seek to report all high-probability interpretations of the data, together with confidence labels in the form of posterior probabilities. A low-complexity recursive procedure is presented for Bayesian estimation in linear regression models. A Gaussian mixture is chosen as the prior on the unknown parameter vector. The algorithm returns both a set of high posterior probability mixing parameters and an approximate minimum mean squared error (MMSE) estimate of the parameter vector. Emphasis is given to the case of a sparse parameter vector. Numerical simulations demonstrate estimation performance and illustrate the distinctions between MMSE estimation and maximum a posteriori probability (MAP) model selection. The proposed tree-search algorithm provides exact ratios of posterior probabilities for a set of high-probability solutions to the sparse reconstruction problem. These relative probabilities reveal ambiguity among multiple candidate solutions arising from low signal-to-noise ratio and/or significant correlation among columns of the super-resolving regressor matrix.
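The approach described above can be illustrated on a toy problem. The sketch below uses a Bernoulli-Gaussian (spike-and-slab) prior, a common instance of the Gaussian-mixture prior for sparse vectors, and exhaustively enumerates supports where the paper's tree search would prune; all dimensions, noise levels, and variable names are illustrative assumptions, not the paper's actual implementation:

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Illustrative small problem: 4-column regressor, 1-sparse truth.
n, m = 8, 4                 # observations, parameters
sigma_x, sigma_n = 1.0, 0.1  # prior std of active coefficients, noise std
p_active = 0.25              # prior activity probability per coefficient

A = rng.standard_normal((n, m))
x_true = np.zeros(m)
x_true[1] = 1.5
y = A @ x_true + sigma_n * rng.standard_normal(n)

def log_evidence(S):
    """Log of p(y | support S): y is zero-mean Gaussian with covariance
    sigma_x^2 * A_S A_S^T + sigma_n^2 * I."""
    C = sigma_n**2 * np.eye(n)
    if S:
        As = A[:, list(S)]
        C = C + sigma_x**2 * (As @ As.T)
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (logdet + y @ np.linalg.solve(C, y) + n * np.log(2 * np.pi))

def log_prior(S):
    """Independent Bernoulli prior on which coefficients are active."""
    k = len(S)
    return k * np.log(p_active) + (m - k) * np.log(1 - p_active)

# Enumerate all supports; the paper's tree search visits only the
# high-probability ones, reporting exact *ratios* of their posteriors.
supports = [S for r in range(m + 1)
            for S in itertools.combinations(range(m), r)]
logp = np.array([log_evidence(S) + log_prior(S) for S in supports])
post = np.exp(logp - logp.max())
post /= post.sum()           # posterior probability of each support

# MMSE estimate: posterior-weighted mixture of per-support conditional means.
x_mmse = np.zeros(m)
for S, w in zip(supports, post):
    if S:
        As = A[:, list(S)]
        G = As.T @ As + (sigma_n / sigma_x)**2 * np.eye(len(S))
        x_mmse[list(S)] += w * np.linalg.solve(G, As.T @ y)

# MAP model selection: the single most probable support.
map_support = supports[int(np.argmax(post))]
```

Note the distinction the abstract draws: `map_support` commits to one model, while `x_mmse` averages the conditional means over all candidate supports weighted by their posterior probabilities; when two supports carry comparable weight (low SNR or correlated columns), the reported probability ratios expose that ambiguity.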
