Abstract

This article addresses two important themes in early visual computation: it presents a novel theory for learning the universal statistics of natural images, and it proposes a general framework for designing reaction-diffusion equations for image processing. We study the statistics of natural images, including their scale-invariant properties, and then learn generic prior models that reproduce the observed statistics, based on minimax entropy theory. The resulting Gibbs distributions have potentials of the form U(I; Λ, S) = Σ_{α=1}^{K} Σ_{x,y} λ^{(α)}((F^{(α)} * I)(x, y)), with S = {F^{(1)}, F^{(2)}, ..., F^{(K)}} being a set of filters and Λ = {λ^{(1)}(·), λ^{(2)}(·), ..., λ^{(K)}(·)} the potential functions. The learned Gibbs distributions confirm and improve the form of existing prior models such as the line process but, in contrast to all previous models, inverted potentials were found to be necessary. We find that the partial differential equations given by gradient descent on U(I; Λ, S) are essentially reaction-diffusion equations, where the usual energy terms produce anisotropic diffusion, while the inverted energy terms produce reaction associated with pattern formation, enhancing preferred image features. We illustrate how these models can be used for texture pattern rendering, denoising, image enhancement, and clutter removal by careful choice of both prior and data models of this type, incorporating the appropriate features.
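The gradient-descent dynamics on U(I; Λ, S) can be sketched in a few lines of NumPy. This is a minimal illustration under simplifying assumptions, not the paper's learned model: the filters are fixed finite differences in x and y rather than learned filters, and the potential λ(ξ) = log(1 + ξ²) is a generic robust choice, so the resulting flow is the diffusive part only (no inverted potentials, hence no reaction terms).

```python
import numpy as np

def grad_x(I):
    # forward difference along x, periodic boundary
    return np.roll(I, -1, axis=1) - I

def grad_x_T(J):
    # adjoint of grad_x (negative backward difference)
    return np.roll(J, 1, axis=1) - J

def grad_y(I):
    return np.roll(I, -1, axis=0) - I

def grad_y_T(J):
    return np.roll(J, 1, axis=0) - J

def lam_prime(xi):
    # derivative of the robust potential lambda(xi) = log(1 + xi^2)
    return 2.0 * xi / (1.0 + xi * xi)

def grade_step(I, eta=0.1):
    # one explicit gradient-descent step on U(I) = sum lambda(F * I);
    # by the chain rule, dU/dI = sum_alpha F_alpha^T lambda'(F_alpha * I)
    dU = grad_x_T(lam_prime(grad_x(I))) + grad_y_T(lam_prime(grad_y(I)))
    return I - eta * dU

# usage: iterating the step diffuses a noise image
rng = np.random.default_rng(0)
I = rng.normal(size=(64, 64))
for _ in range(50):
    I = grade_step(I)
```

Each step lowers the energy U, and because λ is applied to filter responses, the smoothing is anisotropic: weak gradients (|ξ| small, where λ' grows) are diffused strongly, while large gradients are penalized less.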

Highlights

  • In computer vision, many generic prior models have been proposed to capture the universal low-level statistics of natural images

  • We find that the partial differential equations given by gradient descent on U(I; Λ, S) are essentially reaction-diffusion equations, where the usual energy terms produce anisotropic diffusion, while the inverted energy terms produce reaction associated with pattern formation, enhancing preferred image features

  • We find that the partial differential equations given by gradient descent on U(I; Λ, S) are essentially reaction-diffusion equations, which we call the Gibbs Reaction and Diffusion Equations (GRADE)

Summary

INTRODUCTION

In computer vision, many generic prior models have been proposed to capture the universal low-level statistics of natural images. These prior models have been motivated by regularization theory [26], [18], physical modeling [31], [4], Bayesian theory [9], [20], and robust statistics [19], [13], [3]. We illustrate how the learned models can be used for denoising, image enhancement, and clutter removal by careful choice of both prior and noise models of this type, incorporating the appropriate features extracted at various scales and orientations. (D. Mumford is with the Division of Applied Mathematics, Brown University, Providence, RI 02912.)
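The denoising use just mentioned, pairing a prior model with a noise model, amounts to MAP estimation: gradient descent on a posterior energy that combines a data term and a prior term. The sketch below is a hedged illustration under simple assumptions, not the paper's method: a Gaussian noise model, fixed finite-difference filters in place of learned ones, a generic robust potential λ(ξ) = log(1 + ξ²), and parameters (`sigma`, `eta`, `steps`) chosen here for the example.

```python
import numpy as np

def grad(I, axis):
    # forward difference, periodic boundary
    return np.roll(I, -1, axis=axis) - I

def grad_T(J, axis):
    # adjoint filter (negative backward difference)
    return np.roll(J, 1, axis=axis) - J

def lam_prime(xi):
    # derivative of the robust prior potential lambda(xi) = log(1 + xi^2)
    return 2.0 * xi / (1.0 + xi * xi)

def map_denoise(I_obs, sigma=0.2, eta=0.02, steps=200):
    """Gradient descent on the posterior energy
       U_post(I) = ||I - I_obs||^2 / (2 sigma^2) + sum lambda(grad I)."""
    I = I_obs.copy()
    for _ in range(steps):
        prior = sum(grad_T(lam_prime(grad(I, a)), a) for a in (0, 1))
        data = (I - I_obs) / sigma**2
        I -= eta * (data + prior)
    return I

# usage: a noisy step edge
x = np.linspace(0.0, 1.0, 64)
clean = (x > 0.5).astype(float)[None, :] * np.ones((64, 1))
noisy = clean + 0.2 * np.random.default_rng(1).normal(size=clean.shape)
denoised = map_denoise(noisy)
```

The data term keeps the estimate near the observation, while the robust prior smooths noise more aggressively than it smooths the large gradient at the edge.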

Goal of Prior Learning and Two Extreme Cases
Learning Prior Models by Minimax Entropy
Statistics of Natural Images
Experiment I
Remark 1
Remark 2
From Gibbs Distribution to Reaction-Diffusion Equations
Anisotropic Diffusion and Gibbs Reaction-Diffusion
Gibbs Reaction-Diffusion for Pattern Formation
IMAGE ENHANCEMENT AND CLUTTER REMOVAL
Experiment II
CONCLUSION
