Abstract
Compressive sensing (CS) technology is introduced into the space optical remote sensing (SORS) image acquisition stage, enabling wireless image sensor network nodes to acquire images quickly and accurately under the twin constraints of limited battery power and expensive sensors. On this basis, to further improve the quality of CS image reconstruction, we propose a fused-features and perceptual-loss encoder-decoder residual network (FFPL-EDRNet) for image reconstruction. FFPL-EDRNet consists of a convolutional measurement layer and a reconstruction network and is trained end-to-end, which greatly simplifies pre- and post-processing and eliminates the blocking artifacts of reconstructed images. The reconstruction network is based on a residual network and introduces multi-scale feature extraction, multi-scale feature combination and multi-level feature combination. Feature fusion integrates low-level information with high-level information to reduce reconstruction error. The perceptual loss function, based on a pretrained InceptionV3, uses a weighted mean squared error between the features of the reconstructed image and those of the label image, which makes the reconstructed image more semantically similar to the label image. In the measurement stage, we use convolution to perform block-wise compressive measurement and thereby obtain measurements of the full image. For reconstruction, a deconvolution layer first produces an initial reconstruction, which the residual network then refines. The experimental results show that, at measurement rates (MRs) of 0.25, 0.10, 0.04 and 0.01, the reconstructed images obtained by FFPL-EDRNet achieve peak signal-to-noise ratios (PSNR) of 27.502, 26.804, 24.593 and 21.359 and structural similarities (SSIM) of 0.842, 0.816, 0.720 and 0.568. Therefore, FFPL-EDRNet can enhance the quality of image reconstruction.
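As a rough illustration of the convolutional block measurement and deconvolution-based initial reconstruction described above, the following is a minimal PyTorch sketch, assuming a 32×32 block size and grayscale input; the module and parameter names are hypothetical and not taken from the paper.

```python
# Minimal sketch (not the authors' exact architecture): convolutional block
# measurement and deconvolution-based initial reconstruction, assuming a
# block size B = 32 and a measurement rate MR; layer names are hypothetical.
import torch
import torch.nn as nn

class MeasureAndInitRecon(nn.Module):
    def __init__(self, block_size=32, mr=0.25):
        super().__init__()
        m = max(1, int(round(mr * block_size * block_size)))  # measurements per block
        # Block-wise compression: a stride-B convolution produces one measurement
        # vector per non-overlapping B x B block of the grayscale input.
        self.measure = nn.Conv2d(1, m, kernel_size=block_size,
                                 stride=block_size, bias=False)
        # Initial reconstruction: a deconvolution maps each measurement vector
        # back to a B x B block, giving a full-size initial image.
        self.init_recon = nn.ConvTranspose2d(m, 1, kernel_size=block_size,
                                             stride=block_size, bias=False)

    def forward(self, x):
        y = self.measure(x)      # compressed measurements of the full image
        x0 = self.init_recon(y)  # initial reconstruction (later refined by the residual network)
        return y, x0

# Usage: a 1 x 1 x 256 x 256 image at MR = 0.25
net = MeasureAndInitRecon(block_size=32, mr=0.25)
y, x0 = net(torch.randn(1, 1, 256, 256))
print(y.shape, x0.shape)  # torch.Size([1, 256, 8, 8]) torch.Size([1, 1, 256, 256])
```

Because the stride equals the block size, the measurement convolution and the deconvolution operate on non-overlapping blocks, which is what allows the whole pipeline to be trained end-to-end without separate blocking and deblocking steps.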
Highlights
Space optical remote sensing (SORS) technology has become an indispensable part of obtaining intelligence information
Robust to noise: to demonstrate the robustness of FFPL-EDRNet, we study image reconstruction quality in the presence of measurement noise (see the noise-injection sketch below)
With compressive sensing (CS) introduced in the data acquisition stage of SORS images, we propose FFPL-EDRNet to further improve the quality of SORS image reconstruction
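As a small illustration of the robustness study named in the highlights, the sketch below adds zero-mean Gaussian noise to the compressed measurements before the initial reconstruction; this is an assumed setup, not the paper's exact evaluation protocol, and it reuses the hypothetical MeasureAndInitRecon sketch given above.

```python
# Minimal sketch (an assumption about the setup, not the paper's protocol):
# evaluate robustness by adding zero-mean Gaussian noise to the measurements
# before the initial reconstruction; `net` is the MeasureAndInitRecon sketch above.
import torch

def reconstruct_with_noise(net, image, noise_std=0.01):
    y, _ = net(image)                                  # clean compressed measurements
    y_noisy = y + noise_std * torch.randn_like(y)      # simulated measurement noise
    x0 = net.init_recon(y_noisy)                       # initial reconstruction from noisy measurements
    return x0
```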
Summary
Space optical remote sensing (SORS) technology has become an indispensable part of obtaining intelligence information. This paper proposes FFPL-EDRNet for the CS of SORS images, which further enhances reconstruction quality and lays a reliable foundation for subsequent image processing and analysis. Our main contributions are as follows: 1) To further improve the quality of CS image reconstruction, we propose a novel encoder-decoder residual network for image reconstruction, called FFPL-EDRNet; for CS reconstruction of SORS images, this model outperforms existing algorithms. 2) To simplify image pre- and post-processing and eliminate the blocking artifacts of reconstructed images, FFPL-EDRNet connects the measurement layer and the reconstruction network for end-to-end training. 3) To reduce reconstruction error, the reconstruction network introduces multi-scale feature extraction and feature fusion, integrating low-level information with high-level information. 4) To make the reconstructed image more semantically similar to the label image, FFPL-EDRNet uses a perceptual loss function to improve reconstruction quality (see the sketch below). The deconvolution layer and residual network can accurately reconstruct the image from the compressed measurements.
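As an illustration of the perceptual loss named in contribution 4, below is a minimal sketch that compares features of the reconstructed and label images extracted by a frozen, pretrained InceptionV3 from torchvision using a weighted mean squared error; the chosen layers and weights are assumptions, not the paper's configuration.

```python
# Minimal sketch (assumed layers and weights, not the paper's exact loss):
# weighted MSE between InceptionV3 features of the reconstruction and the label.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import inception_v3

class PerceptualLoss(nn.Module):
    def __init__(self, weights=(1.0, 0.5, 0.25)):
        super().__init__()
        backbone = inception_v3(weights="DEFAULT", aux_logits=True).eval()
        # Early stem blocks of InceptionV3 serve as frozen feature extractors.
        self.stages = nn.ModuleList([
            backbone.Conv2d_1a_3x3,
            backbone.Conv2d_2a_3x3,
            backbone.Conv2d_2b_3x3,
        ])
        for p in self.stages.parameters():
            p.requires_grad = False
        self.weights = weights

    def forward(self, recon, label):
        # InceptionV3 expects 3-channel input; replicate the grayscale channel.
        r, l = recon.repeat(1, 3, 1, 1), label.repeat(1, 3, 1, 1)
        loss = 0.0
        for w, stage in zip(self.weights, self.stages):
            r, l = stage(r), stage(l)
            loss = loss + w * F.mse_loss(r, l)  # weighted MSE between feature maps
        return loss
```

Comparing feature maps rather than raw pixels is what pushes the reconstruction to be semantically similar to the label image rather than merely close in pixel values.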