Abstract
CIRAD's TETIS research unit is developing an automated mapping method based on the Moringa processing chain, which minimizes user interaction by automating most of the image analysis and processing steps. The methodology jointly uses a Very High Spatial Resolution image (Spot 6/7 or Pléiades) and one or more time series of High Spatial Resolution optical images, such as Sentinel-2 and Landsat-8, in a classification that combines segmentation and object-based classification (using the Random Forest algorithm), driven by a reference database built from in situ collection and photo-interpretation.

The land use maps are produced as part of the GABIR project (Gestion Agricole des Biomasses à l'échelle de l'Île de La Réunion) and can be downloaded below or from CIRAD's spatial data catalogue for Réunion: http://aware.cirad.fr/

This Dataverse entry concerns the maps produced for the year 2018, using a mosaic of Pléiades images to compute the segmentation (extraction of homogeneous objects from the image). The field database uses a nested nomenclature with three levels of detail, allowing a classification to be produced at each level. Level 3, the most detailed, distinguishes crop types and has an overall accuracy of 87% and a Kappa index of 0.85. Level 2, distinguishing crop groups, has an overall accuracy of 92% and a Kappa index of 0.90. Level 1, distinguishing major land use groups, has an overall accuracy of 97% and a Kappa index of 0.95. A detailed sheet presenting the validation method and results is available for download.
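To illustrate the object-based classification step, the sketch below shows the general principle under stated assumptions: segmentation is assumed to have already produced image objects, each summarized by per-object statistics (for instance, mean reflectance per band and acquisition date from the time series), which a Random Forest then classifies. All data and variable names here are hypothetical placeholders; this is not the actual Moringa implementation.

    # Sketch of object-based classification with a Random Forest.
    # Assumes segmentation already yielded N objects, each described
    # by per-object features computed from the image time series.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Placeholder data: 1000 objects x 40 features
    # (e.g., 10 spectral bands x 4 acquisition dates), 5 classes.
    X = rng.normal(size=(1000, 40))
    y = rng.integers(0, 5, size=1000)

    # Split the reference database into training and validation objects.
    X_train, X_val, y_train, y_val = train_test_split(
        X, y, test_size=0.3, random_state=0, stratify=y
    )

    # Train the Random Forest and classify the validation objects.
    rf = RandomForestClassifier(n_estimators=500, random_state=0)
    rf.fit(X_train, y_train)
    y_pred = rf.predict(X_val)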
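The overall accuracy and Kappa figures reported above are standard confusion-matrix metrics: overall accuracy is the share of correctly classified validation samples, while Cohen's Kappa corrects that share for the agreement expected by chance. A minimal, self-contained sketch of their computation follows; the label arrays are hypothetical examples, not the project's validation data.

    # Computing overall accuracy and Cohen's Kappa from a
    # validation sample, the two quality metrics reported per level.
    from sklearn.metrics import (
        accuracy_score, cohen_kappa_score, confusion_matrix
    )

    # Hypothetical reference and predicted labels for validation objects.
    y_true = [0, 0, 1, 1, 2, 2, 2, 1, 0, 2]
    y_pred = [0, 0, 1, 2, 2, 2, 1, 1, 0, 2]

    print(confusion_matrix(y_true, y_pred))
    print("Overall accuracy:", accuracy_score(y_true, y_pred))
    print("Kappa:", cohen_kappa_score(y_true, y_pred))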