Abstract

The Biosphere‐Atmosphere Transfer Scheme (BATS) ground snow albedo algorithm is commonly used in land‐surface models (LSMs), weather forecasting, and research applications. This study addresses key uncertainties in BATS‐simulated ground snow albedo within the Noah‐MP LSM framework by evaluating and optimizing the Noah‐MP BATS ground snow albedo formulation against 2‐band (visible and near‐infrared (NIR)) in situ albedo observations at Rocky Mountain field stations. The Noah‐MP BATS ground snow albedo scheme is extremely sensitive to its input parameters: an ensemble generated by varying BATS input parameters within plausible ranges yields an average daily range (maximum ensemble member minus minimum ensemble member) of ground snow albedo exceeding 0.45 in both the visible and NIR bands. Parameter optimization improves agreement between simulated and in situ observed ground snow albedo in the visible, NIR, and broadband spectra. Importantly, the optimized parameters reduce biases relative to observed fresh‐snow albedo and improve agreement with observed albedo decay. Our analysis across different sites supports the conclusion that the optimized BATS ground snow albedo parameters are transferable in space and time, at least within the region studied (the central‐southern Rocky Mountains). The primary error source remaining after parameter optimization is that observed fresh‐snow albedo is highly variable, particularly in the NIR spectrum, whereas BATS fresh‐snow albedo is constant, an issue which requires further investigation. This study shows significant correlations between observed fresh‐snow albedo and surface meteorological conditions (e.g., downward shortwave radiation and temperature), which can support future model development that attempts to include a time‐varying formulation for fresh‐snow albedo.
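For readers unfamiliar with the formulation being optimized, the following is a minimal sketch of the standard BATS snow-aging and two-band albedo decay equations as commonly published for Noah-MP's BATS option. The constants shown (fresh-snow albedos of 0.95 visible and 0.65 NIR, decay coefficients, aging time scale, and the dirt/soot term) are the widely cited defaults, not the optimized values from this study, and the function names are illustrative.

```python
import math

TAU0 = 1.0e6          # snow aging time scale (s), standard BATS default
ALB_VIS_NEW = 0.95    # fresh-snow visible (diffuse) albedo, BATS default
ALB_NIR_NEW = 0.65    # fresh-snow NIR (diffuse) albedo, BATS default
C_VIS = 0.2           # visible-band age decay coefficient
C_NIR = 0.5           # NIR-band age decay coefficient

def snow_age_increment(tau, tsfc_k, dt):
    """Advance the nondimensional snow age tau over a time step dt (s)."""
    # r1: grain growth by vapor diffusion, temperature dependent
    r1 = math.exp(5000.0 * (1.0 / 273.16 - 1.0 / tsfc_k))
    # r2: additional grain growth near the melting point
    r2 = min(r1 ** 10, 1.0)
    # r3: dirt and soot contamination effect (a tunable constant)
    r3 = 0.3
    return tau + (r1 + r2 + r3) * dt / TAU0

def bats_snow_albedo(tau):
    """Diffuse visible and NIR snow albedo as a function of snow age tau."""
    fage = tau / (1.0 + tau)  # age factor, bounded in [0, 1)
    alb_vis = ALB_VIS_NEW * (1.0 - C_VIS * fage)
    alb_nir = ALB_NIR_NEW * (1.0 - C_NIR * fage)
    return alb_vis, alb_nir
```

At tau = 0 (fresh snow) this returns the constant values 0.95 and 0.65, which illustrates the constant fresh-snow albedo limitation the abstract identifies; as tau grows under sub-freezing aging, both bands decay, with the NIR band decaying proportionally faster.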
