Abstract

During the last few years, the remote sensing community has been trying to address the need for global synthesis to support policymakers on issues such as deforestation and global climate change. Several global thematic products have been derived from large datasets of low-resolution remotely sensed data, the latter providing the best trade-off between spatial resolution, temporal resolution and cost. However, a standard procedure for the validation of such products has not yet been developed. This paper proposes a methodology, based on statistical indices derived from the widely used Error Matrix, to address the specific issue of the influence of the low spatial resolution of the dataset on the accuracy of the end-product obtained with hard classification approaches. In order to analyse quantitatively the trade-off between omission and commission errors, we suggest the use of the 'Pareto Boundary', a method rooted in economic theory applied to decisions with multiple conflicting objectives. Starting from a high-resolution reference dataset, it is possible to determine the maximum user's and producer's accuracy values (i.e. minimum omission and commission errors) that could be attained jointly by a low-resolution map. The method has been developed for the specific case of dichotomous classifications, and it has been adopted in the evaluation of burned area maps derived from SPOT-VGT with Landsat ETM+ reference data. The use of the Pareto Boundary can help to determine whether the limited accuracy of a low spatial resolution map stems from poor performance of the classification algorithm or from the low resolution of the remotely sensed data being classified.
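The idea above can be illustrated with a minimal numeric sketch. Assume the high-resolution reference map has already been aggregated so that each low-resolution pixel carries the fraction of its area that is truly burned (the fractions below are synthetic, and the function name `pareto_boundary` is a hypothetical helper, not code from the paper). Sweeping the labelling threshold on that fraction traces out the boundary of jointly attainable omission and commission errors for a hard (dichotomous) low-resolution classification:

```python
import numpy as np

# Synthetic per-pixel burned fractions in [0, 1], standing in for a
# high-resolution reference aggregated to the low-resolution grid.
rng = np.random.default_rng(0)
burned_fraction = rng.beta(0.3, 2.0, size=10_000)

def pareto_boundary(fractions, thresholds):
    """For each threshold t, label a low-resolution pixel 'burned' when its
    reference burned fraction >= t, and compute the resulting area-weighted
    omission and commission errors (1 - producer's and 1 - user's accuracy)."""
    total_burned = fractions.sum()  # true burned area, in low-res pixel units
    points = []
    for t in thresholds:
        labelled = fractions >= t
        # Burned area falling inside pixels labelled as burned.
        detected = fractions[labelled].sum()
        omission_err = 1.0 - detected / total_burned
        # Unburned area inside pixels labelled as burned, relative to the
        # total area labelled burned.
        n_labelled = labelled.sum()
        commission_err = 1.0 - detected / n_labelled if n_labelled else 0.0
        points.append((omission_err, commission_err))
    return points

boundary = pareto_boundary(burned_fraction, np.linspace(0.05, 0.95, 19))
```

Each `(omission_err, commission_err)` pair is the best jointly attainable error combination at that threshold: lowering one error necessarily raises the other, which is the trade-off the Pareto Boundary makes explicit.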
