Abstract
Methane is one of the most important anthropogenic greenhouse gases, with a significant impact on the Earth's radiation budget and tropospheric background ozone. Despite a well-constrained global budget, quantifying local and regional methane emissions has proven challenging. Recent advancements in airborne remote sensing instruments such as the next-generation Airborne Visible/Infrared Imaging Spectrometer (AVIRIS-NG) provide 2-D observations of CH4 plume column enhancements at an unprecedented resolution of 1–5 m over large geographic areas. Quantifying an emission rate from observed plumes is a critical step for understanding local emission distributions and prioritizing mitigation efforts. However, no existing method can reliably predict emission rates from detected plumes in real time without ancillary data. To predict methane point-source emissions directly from high-resolution 2-D plume images, without relying on other local measurements such as background wind speeds, we trained a convolutional neural network model called MethaNet. The training data were derived from large eddy simulations of methane plumes with realistic measurement noise over agricultural, desert, and urban environments. Our model predicts unseen plumes with a mean absolute percentage error under 17%, a significant improvement over previous methods that require wind information. Using MethaNet, validation against a natural gas controlled-release experiment agrees to within the precision error estimate. Our results support the applicability of deep learning techniques for quantifying CH4 point sources in an automated manner over large geographical areas, not only for present and future airborne field campaigns but also for upcoming space-based observations in this decade.
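The reported accuracy is stated as a mean absolute percentage error (MAPE) between predicted and true emission rates. As a minimal sketch of how that metric is computed, the snippet below uses purely illustrative emission rates (the values are not from the paper):

```python
import numpy as np

def mape(y_true, y_pred):
    """Mean absolute percentage error, expressed in percent."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean(np.abs(y_pred - y_true) / np.abs(y_true)) * 100.0)

# Hypothetical emission rates in kg/h (illustrative values only)
true_rates = [100.0, 250.0, 60.0]
pred_rates = [110.0, 230.0, 66.0]
print(round(mape(true_rates, pred_rates), 2))  # → 9.33
```

A MAPE under 17% thus means the predicted rate deviates from the true rate by less than 17% on average across the evaluation plumes.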