Abstract

Cloud cover is inevitable in optical remote sensing (RS) imagery because of observation conditions, which limits the availability of RS data. It is therefore of great significance to reconstruct the ground information contaminated by clouds. This paper presents a sparse dictionary learning-based image inpainting method for adaptively recovering, patch by patch, the missing information corrupted by thick clouds. A feature dictionary was learned from exemplars in the cloud-free regions and later utilized to infer the missing patches via sparse representation. To maintain the coherence of structures, structure sparsity was introduced to encourage missing patches lying on image structures to be filled first. The optimization model of patch inpainting was formulated under an adaptive neighborhood-consistency constraint and solved by a modified orthogonal matching pursuit (OMP) algorithm. Based on these ideas, a thick-cloud removal scheme was designed and applied to images with simulated and real clouds. Comparative experiments show that our method not only keeps the reconstructed structures and textures consistent with the surrounding ground information, but also rarely produces smoothing or blocking artifacts, making it well suited to removing clouds from high-spatial-resolution RS imagery with salient structures and abundant textures.
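As a rough illustration of the dictionary-learning step described above, the following Python sketch learns a patch dictionary from cloud-free exemplars only. It uses scikit-learn's MiniBatchDictionaryLearning as a stand-in for the paper's own learning procedure; the function name, patch size, number of atoms, and sampling settings are illustrative assumptions, not values taken from the paper.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.feature_extraction.image import extract_patches_2d

def learn_feature_dictionary(image, cloud_mask, patch_size=8, n_atoms=256):
    """Learn a patch dictionary from the cloud-free (source) region only.

    image      : 2-D array, one band of the RS scene
    cloud_mask : 2-D boolean array, True where pixels are cloud-contaminated
    """
    # Sample candidate patches; the identical random_state keeps image and
    # mask patches aligned to the same locations.
    patches = extract_patches_2d(image, (patch_size, patch_size),
                                 max_patches=20000, random_state=0)
    masks = extract_patches_2d(cloud_mask.astype(float), (patch_size, patch_size),
                               max_patches=20000, random_state=0)
    # Keep only exemplars that contain no cloudy pixel.
    clean = patches[masks.reshape(len(masks), -1).max(axis=1) == 0]
    X = clean.reshape(len(clean), -1).astype(float)
    X -= X.mean(axis=1, keepdims=True)          # remove each patch's DC component
    dico = MiniBatchDictionaryLearning(n_components=n_atoms, alpha=1.0,
                                       batch_size=256, random_state=0)
    dico.fit(X)
    return dico.components_                     # shape: (n_atoms, patch_size**2)
```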

Highlights

  • During the past decades, remote sensing (RS) images have been widely used in many applications, such as scene interpretation, land-use classification, land-cover change monitoring, and atmospheric environment surveying

  • The illustration of our adaptive patch inpainting approach based on sparse dictionary learning is shown in Figure 1a, where the feature dictionary D is learned from exemplars in the source region Ω and used to estimate the missing patch Ψ_p via sparse representation, while its neighboring patches {Ψ_{p_j}} impose the adaptive neighborhood-consistency constraint (see the sketch after this list)

  • This section is devoted to the experimental analysis and discussion of our adaptive patch inpainting scheme for removing clouds from high-resolution RS imagery, in comparison with existing methods (e.g., texture synthesis, morphological component analysis (MCA) [27,46], Markov random field (MRF) [35])
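As referenced in the highlight above, the learned dictionary is used to estimate each cloud-contaminated patch from its observed pixels. The sketch below uses scikit-learn's plain OrthogonalMatchingPursuit as a stand-in for the paper's modified OMP; it omits the neighborhood-consistency constraint and the structure-sparsity priority, and the function name and sparsity level are illustrative assumptions.

```python
import numpy as np
from sklearn.linear_model import OrthogonalMatchingPursuit

def estimate_patch(patch, known, dictionary, n_nonzero_coefs=10):
    """Fill the unknown pixels of one vectorized patch.

    patch      : 1-D array of length patch_size**2 (unknown entries arbitrary)
    known      : 1-D boolean array, True where the pixel value is observed
    dictionary : (n_atoms, patch_size**2) array from the learning step
    """
    D = dictionary.T                                  # columns are atoms
    dc = patch[known].mean()                          # per-patch mean, as in learning
    omp = OrthogonalMatchingPursuit(n_nonzero_coefs=n_nonzero_coefs,
                                    fit_intercept=False)
    omp.fit(D[known, :], patch[known] - dc)           # code only the observed pixels
    full = D @ omp.coef_ + dc                         # synthesize the whole patch
    out = patch.astype(float).copy()
    out[~known] = full[~known]                        # keep observed values, fill the rest
    return out
```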


Summary

Introduction

Remote sensing (RS) images have been widely used in many applications, such as scene interpretation, land-use classification, land-cover change monitoring, and atmospheric environment surveying, yet thick cloud cover frequently limits the availability of usable data. Cloud removal approaches in the spatial-complementation category are mainly built on image inpainting techniques, which utilize the known ground information in cloud-free regions to infer the cloudy parts. Sparse representation has recently been introduced into the image restoration field and proven appropriate for recovering large areas of missing information [38]. Inspired by this idea and by the latest progress in exemplar-based image inpainting, we present a dictionary-learning-based adaptive inpainting approach via patch propagation. In simulated and real experiments on thick-cloud removal from high-spatial-resolution RS images, the proposed method outperforms several existing mainstream approaches: it preserves the continuity of the filled structures and the consistency of the synthesized textures while rarely producing smoothing or edge artifacts.
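To connect the pieces, the sketch below shows a greedy patch-propagation loop of the kind the summary refers to: patches on the fill front are processed in order of a priority measure and their estimates are written back until no cloudy pixel remains. For simplicity the priority here is just the fraction of known pixels in each candidate patch, standing in for the paper's structure-sparsity-based priority; learn_feature_dictionary and estimate_patch are the illustrative helpers sketched above, and the whole loop is an unoptimized assumption rather than the authors' implementation.

```python
import numpy as np

def inpaint_clouds(image, cloud_mask, patch_size=8, n_atoms=256):
    """Remove thick clouds from one band by priority-driven patch propagation."""
    img = image.astype(float).copy()
    unknown = cloud_mask.copy()
    D = learn_feature_dictionary(img, unknown, patch_size, n_atoms)
    h, w = img.shape
    half = patch_size // 2

    def window(y, x):
        # Patch window roughly centred on (y, x), clipped to the image bounds.
        y0 = min(max(y - half, 0), h - patch_size)
        x0 = min(max(x - half, 0), w - patch_size)
        return slice(y0, y0 + patch_size), slice(x0, x0 + patch_size)

    while unknown.any():
        ys, xs = np.where(unknown)
        best, best_conf = None, 0.0
        for y, x in zip(ys, xs):
            sy, sx = window(y, x)
            conf = 1.0 - unknown[sy, sx].mean()   # fraction of known pixels
            if conf > best_conf:                  # simple confidence-based priority
                best, best_conf = (sy, sx), conf
        if best is None:                          # no partially known patch left
            break
        sy, sx = best
        patch = img[sy, sx].ravel()
        known = ~unknown[sy, sx].ravel()
        img[sy, sx] = estimate_patch(patch, known, D).reshape(patch_size, patch_size)
        unknown[sy, sx] = False                   # the filled patch is now treated as known
    return img
```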

Exemplar-Based Image Inpainting
Sparse Dictionary Learning
Adaptive Patch Inpainting Model and Algorithm
Patch Priority
The Optimization Model for Patch Inpainting
Modified OMP and the Inpainting Algorithm
Thick Clouds Removal Scheme for RS Imagery
Experiments and Discussion
Results
Comparisons of Statistical