Abstract

Orthophotos are often used to establish forest maps through visual interpretation and manual stand delineation. A method is proposed for classifying forest stands from digitized colour orthophotos with very high spatial resolution (0.80 m). It combines digital vector and raster data in a per-parcel classification approach. The classification variables are the reflectances in the red, green, and blue bands and six texture indices derived from the grey level co-occurrence matrix (GLCM). The effects of calibration sample size and of the number of variables on classification accuracy were investigated. The calibration and validation sets comprised 220 and 219 parcels, respectively. The maximum global accuracy (79%) was achieved using the six variables that contribute most to the discriminatory power of the model: the reflectances in the red, green, and blue bands and three texture indices (contrast, correlation, and variance). According to the discriminatory power analysis, the global accuracy could be improved further by including the three remaining variables in the model, were the calibration set large enough. Nine forest stand types, based on species composition and height class, were identified. The identified species are common spruce, fir, larch, oak, and beech; three classes were discriminated within common spruce stands. In terms of producer's accuracy, more than 80% of the area was classified with an accuracy above 75%; in terms of user's accuracy, more than 70% of the area was classified with an accuracy above 80%. These results demonstrate the importance of accounting for texture when analysing very high spatial resolution imagery in forest remote sensing.
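
The abstract names three GLCM texture indices (contrast, correlation, and variance) among the classification variables. As a minimal illustration, not the authors' implementation, the sketch below derives these three indices from a single grey-level band of a parcel using scikit-image; the quantisation to 32 grey levels and the single horizontal offset are assumptions made here to keep the example small.

```python
# Hedged sketch: GLCM texture indices for one image patch (not the paper's code).
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_features(patch, levels=32):
    """Return (contrast, correlation, variance) for a 2D uint8 patch,
    e.g. one band of an orthophoto clipped to a parcel."""
    # Quantise 0..255 values to `levels` grey levels to keep the matrix small.
    q = (patch.astype(np.float64) / 256.0 * levels).astype(np.uint8)
    # Symmetric, normalised co-occurrence matrix; a single offset
    # (distance 1, angle 0) is used here for brevity.
    glcm = graycomatrix(q, distances=[1], angles=[0],
                        levels=levels, symmetric=True, normed=True)
    contrast = graycoprops(glcm, "contrast")[0, 0]
    correlation = graycoprops(glcm, "correlation")[0, 0]
    # Haralick's variance, sum_ij P(i,j) * (i - mu)^2, computed by hand
    # because not every scikit-image version exposes it via graycoprops.
    p = glcm[:, :, 0, 0]
    i = np.arange(levels)[:, None]
    mu = np.sum(i * p)
    variance = np.sum(p * (i - mu) ** 2)
    return contrast, correlation, variance
```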

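The wording "discriminatory power of the model" suggests a discriminant-analysis classifier applied per parcel, but the abstract does not name one; the sketch below is therefore only an assumed reconstruction of that step, using scikit-learn's linear discriminant analysis and random placeholder data in place of the real per-parcel variables.

```python
# Hedged sketch of the per-parcel classification step (classifier choice
# and data layout are assumptions, not the paper's stated method).
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X_cal = rng.random((220, 6))     # placeholder: 6 retained variables per calibration parcel
y_cal = rng.integers(0, 9, 220)  # placeholder: 9 forest stand types
X_val = rng.random((219, 6))     # placeholder validation parcels
y_val = rng.integers(0, 9, 219)

model = LinearDiscriminantAnalysis().fit(X_cal, y_cal)
# Global accuracy = fraction of validation parcels assigned the correct stand type.
print(f"global accuracy: {model.score(X_val, y_val):.2f}")
```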