Abstract

We show how modern Bayesian Machine Learning tools can be used effectively to develop efficient methods for filtering Earth Observation signals. Bayesian statistical methods can be thought of as a generalization of the classical least-squares adjustment methods where both the unknown signals and the parameters are endowed with probability distributions, the priors. Statistical inference under this scheme is the derivation of posterior distributions, that is, distributions of the unknowns after the model has seen the data. Least squares can then be thought of as a special case that uses Gaussian likelihoods, or error statistics. In principle, for most non-trivial models, this framework requires performing integration in high-dimensional spaces. Variational methods are effective tools for approximate inference in Statistical Machine Learning and Computational Statistics. In this paper, after introducing the general variational Bayesian learning method, we apply it to the modelling and implementation of sparse mixture of Gaussians (SMoG) models, intended to be used as adaptive priors for the efficient representation of sparse signals in applications such as wavelet-type analysis. Wavelet decomposition methods have been very successful in denoising real-world, non-stationary signals that may also contain discontinuities. For this purpose we construct a constrained hierarchical Bayesian model capturing the salient characteristics of such sets of decomposition coefficients. We express our model as a Dirichlet mixture model. We then show how variational ideas can be used to derive efficient methods for bypassing the need for integration: the task of integration becomes one of optimization. We apply our SMoG implementation to the problem of denoising Synthetic Aperture Radar (SAR) images, which are inherently affected by speckle noise, and show that it achieves improved performance compared to established methods, both in terms of speckle reduction and image feature preservation.
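To make the role of the SMoG prior concrete, the sketch below fits a two-component, zero-mean mixture of Gaussians to synthetic decomposition coefficients. It is only an illustrative example with hypothetical data and plain EM point estimates, not the paper's variational Bayesian implementation (which places Dirichlet and hierarchical priors on the mixture parameters); it shows how a narrow "spike" component captures the many near-zero coefficients and a broad "slab" component captures the few significant ones.

```python
# Illustrative sketch only (hypothetical data, EM point estimates) -- not the
# paper's variational Bayesian SMoG implementation. A narrow zero-mean Gaussian
# models the many near-zero coefficients; a broad zero-mean Gaussian models the
# few significant ones.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic sparse coefficients: ~90% from a narrow "spike", ~10% from a broad "slab".
n = 5000
from_spike = rng.random(n) < 0.9
coeffs = np.where(from_spike, rng.normal(0.0, 0.05, n), rng.normal(0.0, 2.0, n))

def zero_mean_gaussian(x, var):
    """Density of N(0, var) evaluated at x."""
    return np.exp(-0.5 * x**2 / var) / np.sqrt(2.0 * np.pi * var)

# Initial guesses for the mixing weight and the two component variances.
pi_spike, var_spike, var_slab = 0.5, 0.01, 1.0

for _ in range(100):
    # E-step: posterior responsibility of the spike component for each coefficient.
    r_spike = pi_spike * zero_mean_gaussian(coeffs, var_spike)
    r_slab = (1.0 - pi_spike) * zero_mean_gaussian(coeffs, var_slab)
    resp = r_spike / (r_spike + r_slab)

    # M-step: re-estimate the mixing weight and variances from weighted statistics.
    pi_spike = resp.mean()
    var_spike = np.sum(resp * coeffs**2) / np.sum(resp)
    var_slab = np.sum((1.0 - resp) * coeffs**2) / np.sum(1.0 - resp)

print(f"pi_spike={pi_spike:.3f}, var_spike={var_spike:.4f}, var_slab={var_slab:.3f}")
```

In the full variational treatment described in the abstract, such point estimates are replaced by posterior distributions over the mixture parameters, and the fitting objective becomes a lower bound that is optimized instead of integrals that are computed.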

Highlights

  • For that purpose we propose to use the sparse mixture of Gaussians (SMoG) model as a flexible prior on the decomposition coefficients, learnt under the Bayesian framework

  • We focus on the problem of speckle reduction in Synthetic Aperture Radar imagery, an increasingly important, high-resolution Earth Observation modality

  • Where $\varepsilon = x - \hat{x}$ is the error committed by the estimator at hand, leading to the mean squared error (MSE), where $\mathrm{MSE} = \frac{1}{N}\sum_{n=1}^{N} \varepsilon_n^{2}$
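As a small worked example of the MSE criterion in the last highlight (with made-up numbers, not data from the paper):

```python
# Worked example of the MSE criterion above with made-up numbers:
# eps_n = x_n - xhat_n and MSE = (1/N) * sum_n eps_n^2.
import numpy as np

x = np.array([1.0, 0.0, -2.0, 3.5])      # hypothetical true values
xhat = np.array([1.1, -0.2, -1.8, 3.0])  # hypothetical estimator output
eps = x - xhat                           # per-sample estimation errors
mse = np.mean(eps**2)                    # (0.01 + 0.04 + 0.04 + 0.25) / 4 = 0.085
print(mse)
```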

Summary

Introduction

Images and data have found widespread use across a wide array of scientific disciplines and practical applications. The idea in sparse representation is to describe the original signals using such ‘atomic decompositions’, with the coefficients of the decomposition serving as the resultant ‘code’. This code is almost always sparse for a wide variety of signals, meaning that most coefficients will be almost zero, with only a small percentage of them being significantly larger than zero. For that purpose we propose to use the sparse mixture of Gaussians (SMoG) model as a flexible prior on the decomposition coefficients, learnt under the Bayesian framework. Olshausen and Millman [1] have used this prior for learning sparse codes of natural images under a combined Maximum Likelihood/Markov Chain Monte Carlo framework.
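As an illustration of this sparsity claim, the sketch below (an assumed example, not code from the paper) applies a one-level Haar analysis to a piecewise-smooth signal with a discontinuity: nearly all detail coefficients are negligible, and only the few straddling the jump are large.

```python
# Assumed example (not from the paper): one-level orthonormal Haar transform of
# a piecewise-smooth signal. Detail coefficients are near zero except where the
# analysis window straddles the discontinuity, i.e. the code is sparse.
import numpy as np

n = 512
t = np.linspace(0, 1, n)
signal = np.sin(2 * np.pi * 3 * t) + (t > 0.5) * 2.0   # smooth part plus a jump

# One-level orthonormal Haar analysis: approximation and detail coefficients.
approx = (signal[0::2] + signal[1::2]) / np.sqrt(2)
detail = (signal[0::2] - signal[1::2]) / np.sqrt(2)

# Most detail coefficients are tiny; only those near the jump are significant.
threshold = 0.1 * np.max(np.abs(detail))
frac_small = np.mean(np.abs(detail) < threshold)
print(f"fraction of detail coefficients below threshold: {frac_small:.2f}")
```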
