Abstract
Methane is the second most significant greenhouse gas after carbon dioxide. As global warming intensifies, the quantification of methane point sources is becoming increasingly crucial. However, retrieving high-quality methane signals from remote sensing data remains challenging due to factors such as surface reflectance, atmospheric interference, sensor noise, wind speed, and sensor sensitivity. In real-world scenarios, methane quantification from remote sensing often encounters unfavourable conditions that yield low signal-to-noise ratio (SNR) signals, reducing quantification accuracy. To address these challenges, we introduce a novel multi-task learning-based approach, DQNet. Specifically, we incorporate a denoising auxiliary task into the quantification network through an additional denoising branch that recovers clean plume column concentration maps. The network is trained under joint supervision from both denoising and quantification losses, which enables it to learn feature representations that are robust to noise and benefits the quantification of low-SNR images. During inference, the denoising branch is removed, leaving an efficient and robust single quantification network with reduced inference time, supporting onboard computation. We construct a low-SNR database based on the AVIRIS-NG sensor and evaluate the generalization ability of our method. On the synthetic dataset, DQNet achieves the best RMSE, MAPE, and R of 16.478 kg/h, 15.296%, and 95.167%, respectively. Across the entire SNR range, DQNet's relative advantage grows as SNR decreases. Moreover, our approach is not limited to AVIRIS-NG and can readily be extended to various multispectral and hyperspectral satellites.
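The training scheme described above can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the shared encoder and both heads are single linear layers, the data, shapes, and the loss weight `lam` are all invented for illustration, and the real DQNet architecture is not specified in the abstract. The key ideas shown are the joint denoising-plus-quantification objective and the fact that inference uses only the quantification path.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-ins: 16 noisy "plume concentration maps" flattened to 64 features,
# their hypothetical clean counterparts, and hypothetical emission rates (kg/h).
X_noisy = rng.normal(size=(16, 64))
X_clean = X_noisy - 0.1 * rng.normal(size=(16, 64))
y_rate = rng.uniform(10.0, 100.0, size=(16, 1))

# Shared encoder plus two task heads (all linear here, purely for illustration).
W_enc = rng.normal(scale=0.1, size=(64, 32))
W_quant = rng.normal(scale=0.1, size=(32, 1))      # quantification head
W_denoise = rng.normal(scale=0.1, size=(32, 64))   # auxiliary denoising head

def forward(X):
    """Shared features feed both the quantification and denoising heads."""
    h = np.maximum(X @ W_enc, 0.0)  # ReLU features shared by both tasks
    return h @ W_quant, h @ W_denoise

rate_pred, clean_pred = forward(X_noisy)

# Multi-task objective: quantification loss plus a weighted denoising loss.
lam = 0.5  # assumed weighting; the abstract does not give the actual value
loss_quant = np.mean((rate_pred - y_rate) ** 2)
loss_denoise = np.mean((clean_pred - X_clean) ** 2)
loss_total = loss_quant + lam * loss_denoise

# Inference: the denoising branch is discarded; only the quantification
# path (encoder + quantification head) is evaluated.
def infer(X):
    return np.maximum(X @ W_enc, 0.0) @ W_quant

rates = infer(X_noisy)  # shape (16, 1), one predicted rate per scene
```

Because the denoising head only shapes the shared encoder during training, dropping it at inference costs nothing in accuracy while removing its compute, which is what makes the single-branch network suitable for onboard use.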