Sensors used for control have become widespread in water resources recovery facilities in the drive for resource-efficient operations. However, their accuracy depends on uncertain laboratory measurements, which are used for calibration and, in turn, to correct for sensor drift. At the same time, current sensor calibration practices lack a clear theoretical understanding of how measurement uncertainties impact the final control action. The effects of the customary, ad hoc calibration threshold are unknown, leading to the current situation where many wastewater treatment processes are controlled by measurements of unknown accuracy. To study how sensor accuracy is affected by calibration, including varying calibration thresholds, we developed a simple theoretical model with closed-form expressions based on the variance and bias in sensor and laboratory measurements. Simulations of the model showed no practical gain from using a calibration threshold, except when calibration is more time-consuming than validation. By contrast, the best accuracy was obtained by calibrating consistently at every check, which opposes common practice. Further, the sensor calibration error was shown to be transferred to the process, causing a similar deviation from the setpoint when the same sensor was used for control. This emphasizes the importance of minimizing laboratory measurement uncertainties during calibration, which otherwise directly impact operations. Based on these findings, we strongly advise shifting the mindset from treating calibration as a sequential detection-and-correction procedure towards an estimation approach that aims to estimate bias magnitude and drift speed.
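The calibration-threshold question can be illustrated with a minimal Monte Carlo sketch (not the authors' model): a drifting sensor is periodically checked against a noisy laboratory reference and its offset is corrected either only when the deviation exceeds a threshold or at every check. All function names, parameter names, and numerical values below are illustrative assumptions.

```python
# Minimal sketch (illustrative assumptions, not the paper's closed-form model):
# a sensor drifts slowly; every check_interval days it is compared against an
# uncertain laboratory measurement and recalibrated if the deviation exceeds
# a threshold. threshold = 0 corresponds to calibrating at every validation.
import numpy as np

rng = np.random.default_rng(0)

def simulate(threshold, n_days=365, drift_per_day=0.02,
             sensor_noise=0.05, lab_noise=0.15, check_interval=7):
    true_value = 5.0        # constant process value being measured
    offset = 0.0            # accumulated sensor bias (drift)
    errors = []
    for day in range(n_days):
        offset += drift_per_day                              # slow sensor drift
        reading = true_value + offset + rng.normal(0, sensor_noise)
        errors.append(abs(reading - true_value))
        if day % check_interval == 0:
            lab = true_value + rng.normal(0, lab_noise)      # uncertain lab reference
            if abs(reading - lab) > threshold:
                # Calibration forces the sensor to match the lab value, so the
                # lab measurement error becomes the sensor's new residual bias.
                offset = lab - true_value
    return np.mean(errors)

for thr in (0.0, 0.1, 0.3):
    print(f"threshold={thr:.1f}  mean abs error={simulate(thr):.3f}")
```

Under these assumptions, larger thresholds let drift accumulate between corrections, while calibrating at every check limits the error to roughly the laboratory measurement uncertainty, in line with the abstract's conclusion.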