Biophysical changes in the Arctic and boreal zones drive shifts in vegetation, such as increasing shrub cover from warming soil or loss of living mat species due to fire. Understanding current and future responses to these factors requires mapping vegetation at a fine taxonomic resolution and landscape scale. Plants vary in size and spectral signatures, which hampers mapping of meaningful functional groups at coarse spatial resolution. Fine spatial grain of remotely sensed data (<10 cm pixels) is often necessary to resolve patches of many Arctic and boreal plant groups, such as bryophytes and lichens, which are significant components of terrestrial vegetation cover. Separating co-occurring small vegetation patches in images also requires high spectral resolution. Our goal here was to test the capabilities of UAS-based imaging spectroscopy for mapping plant functional types (PFTs) using high spatial and spectral resolution data over Arctic and boreal vegetation at four sites in central Alaska. We then tested several machine learning and deep learning models of PFT cover using the reflectance spectra. The best models were very simple, balancing bias and variance by avoiding overfitting to imbalanced sample sizes while still fitting the independent validation data, explaining >50 % of variation in independent ground cover estimates and classifying validation pixels with >84 % accuracy. We explored the impact of spectral resolution on PFT mapping by including vegetation indices and a gradient of narrow (5 nm) to wide (50 nm) band features in our classification models. Vegetation indices were the most important predictors for classifying PFTs, while including band features improved models; narrow and wide bandwidths had similar importance, but models with wide bandwidths performed slightly better. We conclude that Arctic and boreal PFT reflectance can be pooled across sites for mapping with relatively few labeled pixels. Simple, slightly underfit algorithms outperformed deep learning in classifying PFTs, at least at these small sample sizes, by balancing bias and variance. Future work should aim to increase the number of labeled pixels and the detail of labels to further improve the taxonomic precision of mapping.
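As a concrete illustration of the kind of workflow the abstract describes, the sketch below derives a vegetation index and wide-band features from per-pixel reflectance spectra and fits a simple classifier to labeled PFT pixels. It is a minimal, hypothetical example: the synthetic spectra, the choice of NDVI, the 50 nm band averages, and the random-forest model are assumptions for illustration, not the study's actual data or algorithms.

```python
# Minimal sketch (not the authors' pipeline): vegetation indices plus
# band-averaged features from per-pixel reflectance spectra, fed to a
# simple classifier of PFT labels. All data here are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

# Hypothetical hyperspectral pixels: 400-1000 nm sampled every 5 nm.
wavelengths = np.arange(400, 1000, 5)            # 120 narrow bands
n_pixels, n_classes = 600, 4                     # e.g., shrub, graminoid, bryophyte, lichen
labels = rng.integers(0, n_classes, n_pixels)
# Synthetic reflectance: class-specific slope plus noise (stand-in for real spectra).
spectra = (0.1 + 0.05 * labels[:, None] * (wavelengths / 1000.0)
           + rng.normal(0, 0.01, (n_pixels, wavelengths.size)))

def band_mean(spec, wl, center, width):
    """Average reflectance within +/- width/2 nm of a band center."""
    mask = np.abs(wl - center) <= width / 2.0
    return spec[:, mask].mean(axis=1)

# Vegetation index (NDVI) from red and near-infrared band means.
red = band_mean(spectra, wavelengths, 670, 10)
nir = band_mean(spectra, wavelengths, 800, 10)
ndvi = (nir - red) / (nir + red)

# Wide (50 nm) band features; narrow-band features would simply be the raw 5 nm bands.
wide_centers = np.arange(425, 1000, 50)
wide_features = np.column_stack([band_mean(spectra, wavelengths, c, 50) for c in wide_centers])
features = np.column_stack([ndvi, wide_features])   # indices + wide-band features

# Simple, low-variance classifier scored with cross-validation.
clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, features, labels, cv=5)
print(f"Mean CV accuracy: {scores.mean():.2f}")
```

In this framing, comparing feature sets (indices only, indices plus narrow bands, indices plus wide bands) is a matter of swapping the columns assembled into `features` before cross-validation.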