Abstract

Image enhancement and edge-preserving denoising are important steps prior to classification and other postprocessing of remote sensing images. However, multisensor array systems can simultaneously capture several low-resolution images of the same area at different wavelengths, which are combined into a single image of high spatial/spectral resolution, raising a series of new computational challenges. In this paper, an Open Computing Language (OpenCL)-based parallel implementation approach is presented for near real-time image enhancement based on Bayesian maximum entropy (BME) and for edge-preserving denoising of remote sensing imagery using the local linear Stein's unbiased risk estimate (LLSURE). BME was selected for its results in synthetic aperture radar (SAR) image enhancement, whereas LLSURE has shown better noise-removal properties than other commonly used methods. Within this context, the image processing methods are algorithmically adapted via parallel computing techniques and efficiently implemented on CPUs and commodity graphics processing units (GPUs). Experimental results demonstrate that the GPU-adapted implementations reduce the computational load of real-world image processing enough to support near real-time operation.
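As a rough illustration of the data-parallel pattern the abstract refers to, the sketch below maps one OpenCL work-item to each output pixel over a 2D global range. A simple 3x3 mean filter stands in as a hypothetical placeholder for the per-pixel BME and LLSURE computations; this is not the authors' implementation, and error checking is omitted for brevity.

/*
 * Minimal OpenCL sketch (hypothetical, not the paper's code): one work-item
 * per output pixel, with a 3x3 mean filter standing in for the real kernels.
 * Build with e.g. `gcc demo.c -lOpenCL`.
 */
#include <stdio.h>
#include <stdlib.h>
#include <CL/cl.h>

#define W 512
#define H 512

/* Kernel source: each work-item computes one output pixel from its 3x3
 * neighbourhood, averaging only the neighbours that fall inside the image. */
static const char *src =
"__kernel void mean3x3(__global const float *in, __global float *out,\n"
"                      const int w, const int h) {\n"
"    int x = get_global_id(0), y = get_global_id(1);\n"
"    if (x >= w || y >= h) return;\n"
"    float acc = 0.0f; int n = 0;\n"
"    for (int dy = -1; dy <= 1; dy++)\n"
"        for (int dx = -1; dx <= 1; dx++) {\n"
"            int xx = x + dx, yy = y + dy;\n"
"            if (xx >= 0 && xx < w && yy >= 0 && yy < h) {\n"
"                acc += in[yy * w + xx]; n++;\n"
"            }\n"
"        }\n"
"    out[y * w + x] = acc / n;\n"
"}\n";

int main(void) {
    float *img = malloc(W * H * sizeof(float));
    float *res = malloc(W * H * sizeof(float));
    for (int i = 0; i < W * H; i++) img[i] = (float)(i % 256); /* dummy image */

    cl_platform_id plat; cl_device_id dev; cl_int err;
    clGetPlatformIDs(1, &plat, NULL);
    clGetDeviceIDs(plat, CL_DEVICE_TYPE_DEFAULT, 1, &dev, NULL);
    cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, &err);
    cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, &err);

    cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);
    clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
    cl_kernel k = clCreateKernel(prog, "mean3x3", &err);

    size_t bytes = W * H * sizeof(float);
    cl_mem din  = clCreateBuffer(ctx, CL_MEM_READ_ONLY | CL_MEM_COPY_HOST_PTR,
                                 bytes, img, &err);
    cl_mem dout = clCreateBuffer(ctx, CL_MEM_WRITE_ONLY, bytes, NULL, &err);

    int w = W, h = H;
    clSetKernelArg(k, 0, sizeof(cl_mem), &din);
    clSetKernelArg(k, 1, sizeof(cl_mem), &dout);
    clSetKernelArg(k, 2, sizeof(int), &w);
    clSetKernelArg(k, 3, sizeof(int), &h);

    /* One work-item per pixel: the 2D global range matches the image size. */
    size_t global[2] = { W, H };
    clEnqueueNDRangeKernel(q, k, 2, NULL, global, NULL, 0, NULL, NULL);
    clEnqueueReadBuffer(q, dout, CL_TRUE, 0, bytes, res, 0, NULL, NULL);

    printf("center pixel after filtering: %f\n", res[(H / 2) * W + W / 2]);

    clReleaseMemObject(din); clReleaseMemObject(dout);
    clReleaseKernel(k); clReleaseProgram(prog);
    clReleaseCommandQueue(q); clReleaseContext(ctx);
    free(img); free(res);
    return 0;
}

Launching the kernel over a 2D global range equal to the image dimensions lets the GPU process all pixels concurrently, which is the source of the reduced computational load the abstract describes; the actual BME and LLSURE kernels would replace the neighbourhood average with their respective per-pixel estimators.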
