Abstract

Detecting forest decline is crucial for effective forest management in arid and semi-arid regions. Remote sensing with satellite image time series can identify reduced photosynthetic activity caused by defoliation. However, current studies face limitations in detecting forest decline in sparse semi-arid forests. In this study, three Landsat time-series-based approaches were used to distinguish non-declining from declining forest patches in the Zagros forests. Random forest was the most accurate approach, followed by anomaly detection and Sen's slope, with overall accuracies of 0.75 (kappa = 0.50), 0.65 (kappa = 0.30), and 0.64 (kappa = 0.30), respectively. The classification results were unaffected by Landsat acquisition times, suggesting that environmental variables, rather than the remotely sensed spectral signal of the trees, may have driven the separation of declining and non-declining areas. We conclude that identifying declining forest patches in semi-arid regions using Landsat data is challenging: limited canopy cover against a bright soil background yields a weak vegetation signal, making modest degradation trends hard to detect. Additional environmental variables may be needed to compensate for these limitations.
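The Sen's slope approach mentioned above estimates a monotonic trend as the median of all pairwise slopes in a time series (the Theil-Sen estimator). A minimal sketch is shown below; the NDVI values are hypothetical and stand in for a per-pixel Landsat time series, which the paper itself does not list.

```python
import statistics

def sens_slope(values, times=None):
    """Theil-Sen estimator: median of all pairwise slopes
    (values[j] - values[i]) / (times[j] - times[i]) for j > i."""
    if times is None:
        times = list(range(len(values)))
    slopes = [
        (values[j] - values[i]) / (times[j] - times[i])
        for i in range(len(values))
        for j in range(i + 1, len(values))
        if times[j] != times[i]  # skip duplicate acquisition dates
    ]
    return statistics.median(slopes)

# Hypothetical annual NDVI series for a single forest patch;
# a negative slope would be read as a declining trend.
ndvi = [0.42, 0.40, 0.41, 0.37, 0.35, 0.33]
print(sens_slope(ndvi))
```

A negative estimate flags a candidate declining patch; as the abstract notes, in sparse canopies over bright soil this trend can be too weak to separate decline from background variation.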
