Abstract

Enhanced coagulation can be an effective way to reduce disinfection by-product (DBP) precursor concentrations. Where turbidity is not extremely high, the natural organic matter concentration, evaluated by total or dissolved organic carbon (DOC) concentration or UV absorbance, is known to be the most important factor in determining the appropriate coagulant dose. Yet treatment plant operators often face difficult decisions about coagulant dosage: should coagulation efforts and coagulant doses remain constant year-round when water quality changes seasonally? Should the coagulant dose be increased when DBP standards are not met, or has the maximum removal of DBP precursors already been reached? The objective and novelty of this study are to revisit the concept of enhanced coagulation and to determine optimal coagulation guidelines based not only on the removal of common indicators such as DOC but on the removal of actual DBP precursors. Jar tests using alum were conducted under a range of conditions on eight natural and synthetic waters with varying physicochemical characteristics to evaluate DBP precursor removal; the treated waters were then chlorinated for 48 h to assess DBP formation potential. A coagulant-dose adjustment strategy based on UV254 monitoring was also implemented at a full-scale facility. Results show that, for the wide range of waters tested, an alum/UV254 stoichiometric dose of 180 ± 25 mg alum·cm/L represents a point of diminishing returns (i.e. it maximises DBP precursor removal). Another original result of this work is that this dose is applicable and equally efficient in all seasons, despite changes in water quality. For utilities with similar raw waters, this means that coagulation efforts should be proportional to the UV254 of the raw water, regardless of the season.
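The stoichiometric dosing rule above reduces to a simple calculation: alum dose (mg/L) = ratio (mg alum·cm/L) × UV254 (cm⁻¹). The following Python sketch illustrates that dose adjustment under the assumption that the reported 180 ± 25 mg alum·cm/L ratio applies; the function names and the handling of the ± 25 band are illustrative assumptions, not part of the study.

    # Sketch of UV254-based alum dosing using the stoichiometric ratio
    # reported in this study (180 +/- 25 mg alum·cm/L). Function names
    # and the tolerance-band handling are illustrative, not from the paper.

    STOICH_RATIO = 180.0    # mg alum·cm/L (point of diminishing returns)
    RATIO_TOLERANCE = 25.0  # mg alum·cm/L (reported uncertainty)

    def alum_dose(uv254: float, ratio: float = STOICH_RATIO) -> float:
        """Return the alum dose in mg/L for a raw-water UV254 absorbance (cm^-1)."""
        if uv254 < 0:
            raise ValueError("UV254 absorbance cannot be negative")
        return ratio * uv254

    def dose_band(uv254: float) -> tuple[float, float]:
        """Return the (low, high) dose range implied by the +/- 25 tolerance."""
        return (
            (STOICH_RATIO - RATIO_TOLERANCE) * uv254,
            (STOICH_RATIO + RATIO_TOLERANCE) * uv254,
        )

    # Example: raw water with UV254 = 0.120 cm^-1
    print(alum_dose(0.120))   # 21.6 mg/L target dose
    print(dose_band(0.120))   # (18.6, 24.6) mg/L band

Because the dose scales linearly with UV254, seasonal changes in raw-water quality are absorbed automatically: the operator tracks UV254 and the target dose follows, which is the practical consequence of the study's finding that the same stoichiometric ratio holds in all seasons.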
