Abstract
A series of miscible-displacement experiments was conducted to examine the impact of experimental conditions (detection limit, input-pulse size, input concentration, pore-water velocity, and contact time) on the performance of a mathematical solute-transport model incorporating nonlinear, rate-limited sorption/desorption described by a continuous-distribution reaction function. Effluent solute concentrations were monitored over a range of approximately seven orders of magnitude, allowing characterization of asymptotic tailing phenomena. The model successfully simulated the extensive elution tailing observed in the measured data. Values for the mean desorption rate coefficient (ln k2) and the variance of ln k2 were obtained through calibration of the model to the measured data. Similar parameter values were obtained for experiments with different input-pulse sizes, input concentrations, pore-water velocities, and contact times, suggesting that the model provided a robust representation of sorption/desorption for the system tested. The impact of analytical detection limit was examined by calibrating the model to subsets of the breakthrough curves wherein the extent of the elution tail was artificially reduced to mimic a poorer detection limit. The parameters varied as a function of the extent of the elution tail used for the calibrations, indicating the importance of measuring as full an extent of the tail as possible.
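The continuous-distribution reaction function described in the abstract can be illustrated with a minimal sketch: if ln k2 is normally distributed (i.e., the desorption rate coefficients k2 are lognormal), the site-averaged desorption curve decays far more slowly than a single-rate first-order model, producing the kind of extended elution tailing the study reports. The parameter values below are placeholders, not those fitted in the paper.

```python
import numpy as np

# Placeholder parameters: the abstract reports a calibrated mean and
# variance of ln k2 but does not give numeric values; these are assumed.
mean_ln_k2 = -2.0   # assumed mean of ln k2
var_ln_k2 = 4.0     # assumed variance of ln k2

rng = np.random.default_rng(0)
ln_k2 = rng.normal(mean_ln_k2, np.sqrt(var_ln_k2), size=10_000)
k2 = rng.permutation(np.exp(ln_k2))  # lognormal desorption rate coefficients

# Fraction of sorbed solute remaining, averaged over all sites,
# assuming first-order desorption at each site.
t = np.logspace(-1, 4, 50)
remaining = np.array([np.mean(np.exp(-k2 * ti)) for ti in t])

# Equivalent single-rate model using the median rate coefficient:
# it decays to effectively zero while the distributed-rate model
# retains a long, slowly decaying tail.
single = np.exp(-np.exp(mean_ln_k2) * t)
```

Sites with very small k2 dominate at late time, which is why truncating the measured tail (mimicking a poorer detection limit) biases the calibrated mean and variance of ln k2.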