Abstract

Two statistical models, partial least squares regression (PLSR) and principal component regression (PCR), were compared to determine the predictive accuracy of visible–near-infrared and short-wave infrared reflectance spectroscopy in quantifying the Fe concentration in contaminated soils. Two scenarios were applied to select the best model: Scenario I included all wavelengths (400–2450 nm) and Scenario II encompassed only the characteristic bands of Fe. Pre-processing techniques used to select the best model included first and second derivatives (FD and SD), multiplicative scatter correction (MSC) and standard normal variate (SNV). The abilities of the predictive models were evaluated by splitting the soil samples into two random groups (80% and 20%). The first group (80%) was used for the calibration and validation sets through cross-validation, and the second group (20%) was used to test the models. The coefficient of determination (R²), root mean square error (RMSE) and residual prediction deviation (RPD) were calculated to evaluate the models. Under Scenario I, the PLSR model with SD pre-processing was the more accurate technique for predicting the Fe concentration, whereas under Scenario II, the PLSR model with MSC pre-processing performed better. Comparing Scenarios I and II indicated that the most reliable models for predicting the soil Fe content were constructed by the PLSR model with SD pre-processing and all wavelengths. The modeling results produced by the PLSR model with SD pre-processing could be used to detect, map and monitor Fe-contaminated soils by proximal and remote sensing in mining areas.
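For readers who wish to reproduce the general workflow, the following is a minimal sketch (not the authors' code) of the best-performing pipeline described above: PLSR applied to second-derivative (SD) pre-processed spectra over 400–2450 nm, with an 80/20 calibration/test split, cross-validated calibration, and R², RMSE and RPD as evaluation metrics. The synthetic data, wavelength spacing, Savitzky–Golay window and number of PLS components are all assumptions for illustration only.

```python
# Illustrative sketch of the PLSR + SD pipeline; data and parameters are assumed.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split, cross_val_predict
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
wavelengths = np.arange(400, 2451, 10)      # 400-2450 nm grid (assumed spacing)
X = rng.random((120, wavelengths.size))     # placeholder reflectance spectra
y = rng.random(120) * 5.0                   # placeholder soil Fe concentrations

# Second-derivative (SD) pre-processing via a Savitzky-Golay filter
# (window length and polynomial order are assumptions).
X_sd = savgol_filter(X, window_length=11, polyorder=2, deriv=2, axis=1)

# Random 80/20 split: calibration/validation set vs. independent test set.
X_cal, X_test, y_cal, y_test = train_test_split(
    X_sd, y, test_size=0.2, random_state=0
)

pls = PLSRegression(n_components=8)         # component count is an assumption

# Cross-validation on the calibration set (10-fold here, also an assumption).
y_cv = cross_val_predict(pls, X_cal, y_cal, cv=10).ravel()
print(f"Calibration CV R2 = {r2_score(y_cal, y_cv):.3f}")

# Fit on the full calibration set and evaluate on the held-out test set.
pls.fit(X_cal, y_cal)
y_pred = pls.predict(X_test).ravel()

rmse = mean_squared_error(y_test, y_pred) ** 0.5
rpd = np.std(y_test, ddof=1) / rmse         # residual prediction deviation
print(f"Test R2={r2_score(y_test, y_pred):.3f}  RMSE={rmse:.3f}  RPD={rpd:.2f}")
```

In practice, the number of PLS components would be tuned by minimizing the cross-validation RMSE on the calibration set rather than fixed in advance, and the same pipeline can be rerun with MSC pre-processing and the Fe-characteristic bands to reproduce Scenario II.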
