Abstract

Detecting deforestation at an early stage is fundamental to reducing forest degradation and carbon emissions, as it makes it possible to monitor and curb the illegal activities associated with deforestation. Several regular monitoring projects have been proposed recently, but most of them rely on optical imagery, which is severely restricted by cloud coverage, especially in tropical environments. In this regard, Synthetic Aperture Radar (SAR) is an attractive alternative that can fill this observational gap. This work evaluated and compared a conventional method based on time series and a Fully Convolutional Network (FCN) applied to bi-temporal SAR images. Both approaches were assessed in two regions of the Brazilian Amazon to detect deforestation between 2019 and 2020. Different pre-processing techniques, including filtering and stabilization stages, were applied to the C-band Sentinel-1 images. Furthermore, this study proposes providing the network with a distance map to past deforestation as additional information alongside the pair of images being compared. In our experiments, this proposal brought up to a 4% improvement in average precision. The experimental results further indicated a clear superiority of the DL approach over the time-series-based deforestation detection method used as a baseline in all experiments. Finally, the study confirmed the benefits of pre-processing techniques when using detection methods based on time series. In contrast, the analysis revealed that the neural network could suppress noise in the input images on its own, making filtering innocuous and therefore unnecessary. On the other hand, stabilization of the input images brought non-negligible accuracy gains to the DL approach.
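
The abstract's core proposal, feeding the network a distance map to past deforestation alongside the bi-temporal SAR pair, can be illustrated with a minimal sketch. The snippet below is a hypothetical reconstruction, not the authors' pipeline: the array names, the Euclidean distance transform, and the per-channel standardization are all assumptions used only to show how such an extra input channel could be assembled.

```python
# Hypothetical sketch: assemble a multi-channel FCN input from a bi-temporal
# Sentinel-1 pair plus a distance map to past deforestation.
# Assumes `sar_t1` and `sar_t2` are (H, W) backscatter arrays and
# `past_deforestation_mask` is a boolean (H, W) array (True = already deforested).
import numpy as np
from scipy.ndimage import distance_transform_edt

def build_fcn_input(sar_t1, sar_t2, past_deforestation_mask):
    # Euclidean distance from each pixel to the nearest past-deforestation pixel.
    # distance_transform_edt measures distance to the nearest zero-valued pixel,
    # so the mask is inverted first.
    distance_map = distance_transform_edt(~past_deforestation_mask)

    # Standardize each channel (a common choice; the paper may scale differently).
    def standardize(x):
        return (x - x.mean()) / (x.std() + 1e-8)

    # Stack as (H, W, C): the two SAR dates being compared plus the extra
    # distance channel supplied to the network as additional information.
    return np.stack(
        [standardize(sar_t1), standardize(sar_t2), standardize(distance_map)],
        axis=-1,
    )
```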
