Abstract

Water quality monitoring is essential for understanding the complex dynamics of aquatic ecosystems, assessing the impact of human infrastructure on them, and ensuring the safe use of water resources for drinking, recreation and transport. High-frequency in-situ monitoring systems are increasingly employed in water quality monitoring schemes because they offer much finer temporal resolution than traditional grab sampling, while avoiding the cost, manpower and time needed to collect and process manual samples. Modelling water quality data at higher frequency reduces uncertainty and allows transient events to be captured, although constraints on data storage and power consumption, and the additional noise that very frequent sampling can introduce, make it worthwhile to avoid an excessively high sampling frequency. In this study, high-frequency data recorded in Bristol's Floating Harbour as part of the local UKRIC Urban Observatory activities are presented to analyse events not captured by the current manual sampling and laboratory analysis scheme. The frequency components of the time series are analysed to work towards establishing the necessary sampling frequency for temperature, dissolved oxygen (DO), fluorescent dissolved organic matter (fDOM), turbidity and conductivity as indicators of water quality. This study is the first of its kind to explore a statistical approach for determining the optimum sampling frequency for different water quality parameters using a high-frequency dataset. Furthermore, it provides practical tools to understand how representative different sampling frequencies are of the water quality changes.
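As a rough illustration of the kind of frequency analysis described above (not the specific procedure used in this study), the sketch below builds a synthetic 15-minute record, estimates its power spectrum with a periodogram, and converts the shortest significant period into an upper bound on the sampling interval via the Nyquist criterion. The signal components, variable names and the 1% significance threshold are assumptions made purely for illustration.

```python
import numpy as np
from scipy.signal import periodogram

# Illustrative only: a synthetic 15-minute "dissolved oxygen" record standing in
# for the sonde data, with a diurnal cycle, a weaker 12 h component and noise.
rng = np.random.default_rng(42)
fs = 96                                  # samples per day (15-min interval)
t = np.arange(0, 60, 1 / fs)             # 60 days, time measured in days
do = (8.0
      + 1.0 * np.sin(2 * np.pi * 1 * t)  # 24 h cycle
      + 0.4 * np.sin(2 * np.pi * 2 * t)  # 12 h cycle
      + 0.1 * rng.standard_normal(t.size))

# Power spectral density; with fs in samples/day, frequencies are in cycles/day.
f, pxx = periodogram(do, fs=fs, detrend="linear")

# Keep frequencies carrying a non-trivial share of the spectral power
# (the 1% threshold is an arbitrary choice for this sketch, not from the paper).
significant = f[(f > 0) & (pxx > 0.01 * pxx.max())]
f_max = significant.max()                # highest significant frequency, cycles/day

shortest_period_h = 24 / f_max
max_interval_h = shortest_period_h / 2   # Nyquist: sample at least twice per cycle
print(f"Shortest significant period: {shortest_period_h:.1f} h")
print(f"Sampling interval should not exceed ~{max_interval_h:.1f} h")
```

With these toy components the bound works out to 6 h; that matches the figure reported in the highlights only because of the periods chosen for the synthetic signal, not because this sketch reproduces the study's analysis.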

Highlights

  • The ability to monitor water quality is critical to managing precious freshwater resources for drinking water, recreation and ecosystem support purposes

  • This study further explores the issue of frequency optimisation by applying three different statistical approaches to determine the optimum sampling frequency for different water quality parameters from a high-frequency dataset

  • The frequency analysis applied to the time series of temperature, dissolved oxygen (DO) and fluorescent dissolved organic matter (fDOM) shows that, to adequately characterise the water body, samples should be taken at least every 6 h (see the subsampling sketch below)
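How representative a coarser sampling scheme is can also be probed empirically, by subsampling a high-frequency record at candidate intervals and checking how well each subsample reproduces summary statistics of the full series. The sketch below is a minimal, hypothetical version of that idea (synthetic data, arbitrary candidate intervals and an RMSE-of-daily-means metric); it is not one of the statistical approaches applied in the study.

```python
import numpy as np
import pandas as pd

# Synthetic 15-minute record, same flavour as the sketch after the abstract
# (diurnal and semi-diurnal components plus noise; phases are arbitrary).
rng = np.random.default_rng(0)
idx = pd.date_range("2021-01-01", periods=60 * 96, freq="15min")
t = np.arange(idx.size) / 96.0               # time in days
series = pd.Series(
    8.0 + np.sin(2 * np.pi * t + 0.8) + 0.4 * np.sin(4 * np.pi * t + 0.3)
    + 0.1 * rng.standard_normal(idx.size),
    index=idx,
)

daily_full = series.resample("1D").mean()    # reference: daily means of full record

# For each candidate interval, keep one reading per interval (mimicking a grab
# sample) and see how far the resulting daily means drift from the reference.
for interval in ["1h", "6h", "12h", "24h"]:
    sub = series.resample(interval).first()
    daily_sub = sub.resample("1D").mean()
    rmse = np.sqrt(((daily_full - daily_sub) ** 2).mean())
    print(f"{interval:>4}: RMSE of daily means vs full record = {rmse:.3f}")
```

In this kind of comparison, the error typically stops improving once the interval is short enough to resolve the dominant cycles, which is the practical sense in which a given sampling frequency can be judged representative.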

Introduction

The ability to monitor water quality is critical to managing precious freshwater resources for drinking water, recreation and ecosystem support purposes. Until the 21st century, water quality monitoring networks (WQMNs) were typically reliant on manual (grab) sampling, followed by transportation to a laboratory for chemical and biological analysis (Strobl and Robillard, 2008; Tapparello et al., 2017). This provides adequate information for long-term monitoring of many water quality parameters, but it becomes time-consuming and costly when data are needed to analyse short-term trends or changes, and it may be impractical for detecting rapid changes in variables that are highly sensitive to weather and other environmental influences, for example turbidity, temperature, conductivity, dissolved oxygen and dissolved organic matter (Ivanovsky et al., 2016). In-situ sensor technologies make possible remote, continuous, real-time monitoring and visualisation of water quality parameters at fixed locations (Tapparello et al., 2017; Chen and Han, 2018; Hadimani et al., 2021) and have been found to better describe water bodies than traditional grab sampling.

