Abstract

Reliable estimates of poplar plantation area are not available at the French national scale because existing forest databases are unsuitable for this short-rotation species and are updated too infrequently. While supervised classification methods have been shown to be highly accurate in mapping forest cover from remotely sensed images, their performance depends to a great extent on the labelled samples used to build the models. In addition to their high acquisition cost, such samples are often scarce and not fully representative of the variability in class distributions. Consequently, when classification models are applied to large areas with high intra-class variance, they generally yield poor accuracies because of data shift issues. In this paper, we propose the use of active learning to efficiently adapt a classifier trained on a source image to spatially distinct target images with minimal labelling effort and without sacrificing classification performance. The adaptation consists in actively adding to the initial local model new relevant training samples from other areas in a cascade that iteratively improves the generalisation capabilities of the classifier, leading to a global model tailored to these different areas. This active selection relies on uncertainty sampling to focus directly on the most informative pixels, i.e. those whose class labels the algorithm is least certain of. Experiments conducted on Sentinel-2 time series revealed their high capacity to identify poplar plantations at a local scale, with an average F-score ranging from 89.5% to 99.3%. For large-area adaptation, the results showed that, for the same number of training samples, active learning outperformed random sampling by up to 5% in overall accuracy and up to 12% in class F-score. Additionally, depending on the class considered, the random sampling model required up to 50% more samples to achieve the same performance as an active learning-based model.
Moreover, the results demonstrate the suitability of the derived global model for accurately mapping poplar plantations among other tree species, with overall accuracy values up to 14% higher than those obtained with local models. The proposed approach paves the way for national-scale mapping in an operational context.
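The paper itself provides no code; as an illustration only, the uncertainty-sampling step described in the abstract (querying the pixels whose class labels the classifier is least certain of) can be sketched as follows. The Random Forest classifier, the synthetic "source" and "target" data, and the least-confidence criterion are assumptions for this sketch, not the authors' exact setup:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def least_confidence_sampling(model, candidate_pixels, batch_size):
    """Rank unlabelled candidate pixels by prediction uncertainty and
    return the indices of the `batch_size` least-confident ones."""
    proba = model.predict_proba(candidate_pixels)
    confidence = proba.max(axis=1)  # probability of the predicted class
    return np.argsort(confidence)[:batch_size]

# Toy illustration: a model trained on a "source" area selects the most
# informative pixels from a spatially distinct, shifted "target" area.
rng = np.random.default_rng(0)
X_source = rng.normal(0.0, 1.0, size=(200, 10))  # e.g. 10 spectral features
y_source = (X_source[:, 0] > 0).astype(int)      # poplar vs. other (synthetic)
X_target = rng.normal(0.5, 1.0, size=(500, 10))  # shifted target distribution

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_source, y_source)

# Pixels to be labelled next and added to the training set in the cascade.
query_idx = least_confidence_sampling(model, X_target, batch_size=20)
```

In the cascade described above, the queried pixels would be labelled, appended to the training set, and the model retrained, iterating until the labelling budget is exhausted.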

Highlights

  • Poplar (Populus spp.) is a fast-growing, wood-producing tree increasingly considered an important resource to meet the global demand for natural forest products

  • The results revealed a high capacity of S2 to identify poplar plantations with an average F-score ranging from 89.5% to 99.3%

  • We propose the use of an active learning approach for the classification of poplar plantations, among other tree species, in a large-scale context


Introduction

Poplar (Populus spp.) is one of the fast-growing, wood-producing tree species increasingly considered an important resource to meet the global demand for natural forest products. Over the past 20 years, the poplar sector has undergone several economic, social and environmental upheavals, which have had an impact on the planting rate [3]. This multi-year planting deficit has led the sector towards an unavoidable wood shortage, expected to reach at least 500,000 m³/year by 2025 according to the CNP. In view of this risky situation, national strategies have been undertaken to encourage all industry stakeholders, including providing financial incentives to replant poplar.


