This study evaluated the reliability and equivalency of three different potable reuse paradigms: (1) surface water augmentation via de facto reuse with conventional wastewater treatment; (2) surface water augmentation via planned indirect potable reuse (IPR) with ultrafiltration, pre-ozone, biological activated carbon (BAC), and post-ozone; and (3) direct potable reuse (DPR) with ultrafiltration, ozone, BAC, and UV disinfection. A quantitative microbial risk assessment (QMRA) was performed to (1) quantify the risk of infection from Cryptosporidium oocysts; (2) compare the risks associated with different potable reuse systems under optimal and sub-optimal conditions; and (3) identify critical model/operational parameters based on sensitivity analyses. The annual risks of infection associated with the de facto and planned IPR systems were generally consistent with those of conventional drinking water systems [mean of (9.4 ± 0.3) × 10⁻⁵ to (4.5 ± 0.1) × 10⁻⁴], while DPR was clearly superior [mean of (6.1 ± 67) × 10⁻⁹ during sub-optimal operation]. Because the advanced treatment train in the planned IPR system was highly effective in reducing Cryptosporidium concentrations, the associated risks were generally dominated by the pathogen loading already present in the surface water. As a result, risks generally decreased with higher recycled water contributions (RWCs). Advanced treatment failures were generally inconsequential either due to the robustness of the advanced treatment train (i.e., DPR) or resiliency provided by the environmental buffer (i.e., planned IPR). Storage time in the environmental buffer was important for the de facto reuse system, and the model indicated a critical storage time of approximately 105 days. Storage times shorter than the critical value resulted in significant increases in risk. The conclusions from this study can be used to inform regulatory decision making and aid in the development of design or operational criteria for IPR and DPR systems.
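A QMRA of this type typically couples Monte Carlo sampling of exposure with a pathogen dose-response model to estimate annual infection risk. The sketch below is a minimal illustration of that general structure, assuming the exponential dose-response model commonly applied to Cryptosporidium; the concentration distribution, ingestion volume, and dose-response parameter are hypothetical placeholders, not the values or treatment-train model used in this study.

```python
# Minimal QMRA sketch (illustrative only): annual Cryptosporidium infection
# risk from finished-water oocyst concentrations, using the exponential
# dose-response model. All parameter values are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(seed=1)
n_sim = 100_000                      # Monte Carlo iterations

# Hypothetical finished-water oocyst concentration (oocysts/L), lognormal
conc = rng.lognormal(mean=np.log(1e-5), sigma=1.0, size=n_sim)

ingestion = 1.0                      # unboiled tap water consumed (L/day), assumed
r = 0.09                             # exponential dose-response parameter, assumed

dose = conc * ingestion              # daily dose (oocysts/day)
p_daily = 1.0 - np.exp(-r * dose)    # daily probability of infection
p_annual = 1.0 - (1.0 - p_daily) ** 365  # annual probability of infection

print(f"Mean annual risk: {p_annual.mean():.2e}")
print(f"95th percentile:  {np.percentile(p_annual, 95):.2e}")
```

In a full assessment such as the one described above, the finished-water concentration term would itself be simulated from source-water pathogen loading, recycled water contribution, storage time in the environmental buffer, and the log reductions credited to each treatment barrier under optimal and sub-optimal conditions.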