Leakage detectability is a key consideration in evaluating the effectiveness of a measurement, monitoring and verification (MMV) plan for a geologic carbon storage (GCS) project. While studies have shown that surface-based geophysical monitoring methods may be sensitive enough to detect CO2 leakage, these methods are an indirect indicator of leakage. A drawback to relying solely on direct geochemical monitoring is that by the time a significant leak is confirmed, it may be too late to mitigate or remediate the environmental impacts. In this study, we combine information from geophysical and geochemical monitoring methods to provide an integrated diagnosis of leakage events. The detectability of various monitoring parameters (pH, TDS, alkalinity, Ca, Cl, Na, and pressure) collected from monitoring wells is evaluated by leakage detection probability, using simulated wellbore-leakage/plume-migration events and monitoring scenarios for a hypothetical GCS operation at the Kimberlina site in California, USA. The NUFT code is used to model these events by coupling wellbore-leakage simulations to three-dimensional reactive multi-phase flow and transport simulations of brine and CO2 leakage-plume migration in aquifers overlying the GCS reservoir. Wellbore leakage in legacy wells located 1.4, 3.4, and 6.8 km from the CO2 injector is evaluated for a range of (1) wellbore bottom-hole pressures and CO2 saturations determined by GCS simulations, (2) regional groundwater gradients in the aquifers, and (3) wellbore permeabilities. Simulated leakage-induced changes in seven monitoring parameters at different depths are used to calculate the corresponding detection probabilities, based on background distribution data and selected monitoring-technology detection thresholds. The responses for these monitoring parameters are tested and combined to enhance overall detectability.
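The detection-probability calculation described above can be illustrated with a minimal sketch. This is not the study's actual workflow: the background pH distribution, the anomaly percentile, the instrument threshold, and the simulated values below are all hypothetical placeholders standing in for the Kimberlina background data and monitoring-technology thresholds.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical background pH samples at one monitoring depth
# (stand-in for measured background distribution data).
background_pH = rng.normal(loc=7.6, scale=0.15, size=1000)

# Detection criterion: a simulated value must fall below the 1st
# percentile of background AND its change from the baseline must
# exceed an assumed instrument resolution threshold.
lower_bound = np.percentile(background_pH, 1)
instrument_threshold = 0.1  # assumed minimum resolvable pH change

def is_detected(simulated_pH, baseline_pH=7.6):
    anomalous = simulated_pH < lower_bound
    resolvable = abs(simulated_pH - baseline_pH) > instrument_threshold
    return anomalous and resolvable

# Hypothetical leakage-induced pH values at monitoring nodes/times
simulated = [7.55, 7.1, 6.8, 7.4, 6.5]
detection_probability = np.mean([is_detected(v) for v in simulated])
```

The same pattern applies to TDS, alkalinity, and the ionic species, with the inequality direction flipped for parameters that increase under leakage.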
The results indicate that leakage signals are more easily detected at shallower depths, where buoyant CO2 has migrated and flashed from supercritical to gas phase, causing a large increase in CO2 volume. While the results suggest pH monitoring is more responsive to the simulated leakage events than TDS monitoring at shallower depths, TDS changes may be more readily observed at greater depths. The addition of carbonate alkalinity can confirm CO2 leakage detection and help distinguish CO2 leakage from other contamination sources. Direct pressure measurements are very sensitive to leakage, but the rapid and broad propagation of the pressure response makes it difficult to locate the origin of leakage from such measurements alone. Combining measurements could greatly improve the confidence of leakage diagnosis. High bottom-hole CO2 saturation, such as that for a leaky well close to the injector (~1.4 km), high regional groundwater gradient, and high wellbore permeability all increase the leakage plume size, and thus the leakage detectability. Our analysis suggests pressure monitoring is a valuable indicator of leakage events at early stages, while pH, TDS, and carbonate alkalinity monitoring can directly diagnose leakage impacts by providing more detailed information about the groundwater receptor. Finally, an example Bayesian belief network model is presented for evaluating the effect of risk-reduction options in terms of joint monitoring detection probability using the associated simulation scenarios. This framework is demonstrated to inform how pressure and groundwater-quality information can be integrated into site MMV and risk management plans.
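The idea of combining measurements into a joint detection probability can be sketched as follows. This is a simplification, not the paper's belief network: it assumes the monitoring parameters respond conditionally independently given the leakage state, and the per-parameter probabilities, prior, and false-positive rate are invented for illustration.

```python
# Minimal sketch of joint detection and a Bayesian update,
# assuming conditional independence of parameters given leakage.

def joint_detection_probability(individual_probs):
    """P(at least one parameter detects) = 1 - product of miss rates."""
    p_miss = 1.0
    for p in individual_probs:
        p_miss *= (1.0 - p)
    return 1.0 - p_miss

# Hypothetical per-parameter detection probabilities for one scenario
probs = {"pressure": 0.9, "pH": 0.6, "TDS": 0.4, "alkalinity": 0.5}
p_joint = joint_detection_probability(probs.values())

# Bayesian update: posterior probability of leakage given a positive
# joint detection, with an assumed prior and false-positive rate.
prior_leak = 0.01
false_positive = 0.05
posterior = (p_joint * prior_leak) / (
    p_joint * prior_leak + false_positive * (1.0 - prior_leak))
```

A full belief network would additionally encode dependencies between parameters (e.g., pH and alkalinity both driven by dissolved CO2), which the independence assumption here ignores.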