Abstract

Signal-to-Noise Ratio (SNR) is a key benchmark for evaluating the quality of optical remote sensors. Most traditional SNR estimation methods involve complicated processing, have low efficiency, and offer only moderate accuracy; in particular, they are not suitable for distributed computation on intelligent satellites. An intelligent SNR estimation algorithm with high computational efficiency and better accuracy is therefore urgently needed. Considering the simplicity of distributed deployment and the goal of a lightweight model, we first propose a convolutional neural network (CNN) similar to VGG (proposed by the Visual Geometry Group) to estimate the SNR of optical remote sensors. In addition, exploiting the advantages of multi-branch structures, we second propose a novel training scheme: multi-branch training followed by parameter-reconstructed inference. In this study, simulated and real remote sensing images with different ground features are used to validate the effectiveness of the model and the training scheme. The experimental results show that the training scheme enhances the fitting ability of the network, and that the proposed CNN trained with it delivers accurate and reliable SNR estimates, achieving a 3.9% RMSE on simulated images with known noise levels. Compared with the reference methods, including traditional and typical SNR estimation methods and the denoising convolutional neural network (DnCNN), the proposed CNN trained with the novel scheme performs best, achieving a relative error of 5.5% on hyperspectral images. The approach is suitable for optical remote sensing images with complicated ground surfaces and different noise levels captured by different optical remote sensors.
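The abstract gives no implementation details, but "multi-branch training with parameter-reconstructed inference" suggests structural re-parameterization: during training, a block sums parallel branches (e.g. a 3×3 convolution, a 1×1 convolution, and an identity shortcut); at inference, all branches are algebraically folded into a single 3×3 kernel, so the deployed network is a plain single-branch CNN. A minimal single-channel NumPy sketch of that folding, with all names, shapes, and branch choices being illustrative assumptions rather than the paper's actual architecture:

```python
import numpy as np

def conv2d(x, k):
    """'Same' 2D cross-correlation (stride 1, zero padding) of a single-channel map."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((8, 8))   # toy "image patch"
k3 = rng.standard_normal((3, 3))  # 3x3-conv branch weights
k1 = rng.standard_normal((1, 1))  # 1x1-conv branch weights

# Training-time multi-branch block: sum of three parallel branches.
y_branch = conv2d(x, k3) + conv2d(x, k1) + x

# Inference-time reconstruction: embed the 1x1 kernel and the identity
# (a delta kernel) at the centre of a single fused 3x3 kernel.
k_fused = k3.copy()
k_fused[1, 1] += k1[0, 0] + 1.0

y_fused = conv2d(x, k_fused)
assert np.allclose(y_branch, y_fused)  # fused block reproduces the multi-branch output
```

Because the fusion is exact, the inference network keeps the representational benefit of multi-branch training while paying only the cost of a single convolution per block, which matches the abstract's lightweight, deployment-oriented goal.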
