Abstract

Providing reliable environmental quality standards (EQSs) is a challenging issue in environmental risk assessment (ERA). These EQSs are derived from toxicity endpoints estimated from dose-response models to identify and characterize the environmental hazard of chemical compounds released by human activities. These toxicity endpoints include the classical x% effect/lethal concentrations at a specific time t (EC/LC(x, t)) and the more recent multiplication factors applied to environmental exposure profiles leading to x% effect reduction at a specific time t (MF(x, t), denoted LP(x, t) by EFSA). However, classical dose-response models used to estimate toxicity endpoints have some weaknesses, such as their dependency on observation time points, which are likely to differ between species (e.g., experiment duration). Furthermore, real-world exposure profiles are rarely constant over time, which makes the use of classical dose-response models difficult and may prevent the derivation of MF(x, t). When dealing with survival or immobility toxicity test data, these issues can be overcome with the use of the general unified threshold model of survival (GUTS), a toxicokinetic-toxicodynamic (TKTD) model that provides an explicit framework to analyse both time- and concentration-dependent data sets and to obtain a mechanistic derivation of EC/LC(x, t) and MF(x, t) regardless of x and at any time t of interest. In ERA, risk assessment is inherently built upon probability distributions, such that the next critical step is to characterize the uncertainties of toxicity endpoints and, consequently, those of EQSs. From this perspective, we investigated the use of a Bayesian framework to obtain the uncertainties from the calibration process and to propagate them to model predictions, including LC(x, t) and MF(x, t) derivations.
We also explored the mathematical properties of LC(x, t) and MF(x, t), as well as the impact of different experimental designs, to provide recommendations for a robust derivation of toxicity endpoints leading to reliable EQSs: avoid computing LC(x, t) and MF(x, t) for extreme x values (0 or 100%), where uncertainty is maximal; compute MF(x, t) after a time period long enough to account for depuration; and test survival under pulses separated by different time intervals.
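As a point of reference for how these endpoints are obtained, the reduced GUTS model with stochastic death (GUTS-RED-SD) can be sketched as follows; this is the standard formulation, and the symbol names are illustrative rather than taken from the paper:

```latex
% Scaled damage D_w(t) tracks the exposure profile C_w(t) (toxicokinetic part)
\frac{\mathrm{d}D_w(t)}{\mathrm{d}t} = k_d \bigl( C_w(t) - D_w(t) \bigr)

% Hazard rate: background mortality h_b plus an excess above threshold z_w
h_z(t) = b_w \max\bigl( D_w(t) - z_w,\; 0 \bigr) + h_b

% Survival probability over time
S(t) = \exp\!\left( - \int_0^t h_z(\tau)\, \mathrm{d}\tau \right)
```

LC(x, t) is then the constant concentration for which S(t) is reduced by x% relative to the control, and MF(x, t) is the factor applied to a given exposure profile that achieves the same x% reduction.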

Highlights

  • Assessing the environmental risk of chemical compounds requires the definition of environmental quality standards (EQSs)

  • The general unified threshold model of survival (GUTS)-RED-IT model is based on the critical body residue (CBR) approach, which assumes that individuals differ in their thresholds, following a probability distribution, and die as soon as the internal concentration reaches the individual-specific threshold[10]

  • For all compounds, fitting the model to survival data obtained under constant exposure profiles yields better fits than fitting to data obtained under time-variable exposure profiles (Table 2, see posterior predictive check graphics in Supplementary Material), regardless of the measure of goodness-of-fit
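To make the individual tolerance assumption concrete, here is a minimal numerical sketch of GUTS-RED-IT survival under constant exposure, with a log-logistic threshold distribution. This is not the authors' implementation, and the parameter names (`kd`, `mw`, `beta`, `hb`) and values are illustrative:

```python
import numpy as np

def survival_it(conc, times, kd, mw, beta, hb):
    """GUTS-RED-IT survival under a constant exposure `conc`.

    Thresholds follow a log-logistic distribution (median mw, shape beta);
    an individual dies as soon as its scaled damage exceeds its own threshold.
    Under constant exposure the damage is monotonically increasing, so the
    running maximum of damage equals the damage itself.
    """
    d_max = conc * (1.0 - np.exp(-kd * times))            # scaled damage
    # Log-logistic CDF of the threshold distribution (guard against D = 0)
    frac_dead = 1.0 / (1.0 + (np.maximum(d_max, 1e-12) / mw) ** (-beta))
    # Background mortality applies multiplicatively to all individuals
    return np.exp(-hb * times) * (1.0 - frac_dead)

t = np.array([0.0, 1.0, 2.0, 4.0])
s = survival_it(5.0, t, kd=0.5, mw=2.0, beta=3.0, hb=0.01)
```

Survival starts at 1, stays in (0, 1], and decreases over time, reflecting that more individuals exceed their tolerance threshold as damage builds up.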


Introduction

Assessing the environmental risk of chemical compounds requires the definition of environmental quality standards (EQSs). However, classical dose-response analyses of standard toxicity tests, which are performed under constant exposure, make it difficult to extrapolate the results to more realistic scenarios with time-variable exposure profiles combining different heights, widths and frequencies of contaminant pulses[6,7,8,9]. To overcome this limitation at the organism level, the use of mechanistic models, such as toxicokinetic-toxicodynamic (TKTD) models, is promoted to describe the effects of a substance of interest by integrating the dynamics of the exposure[1,10,11]. TKTD models appear highly advantageous in terms of gaining a mechanistic understanding of the chemical mode of action, deriving time-independent parameters, interpreting time-varying exposure and making predictions under untested conditions[9,10]. Another advantage of TKTD models for ERA is the possible calculation of lethal concentrations for any x% of the population at any given exposure duration t, denoted LC(x, t). Bayesian inference is well suited to decision making, as it provides assessors with a range of values rather than a single point estimate, which is valuable in risk assessment[16,19].
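As an illustration of how LC(x, t) can be computed for any x and t, the following sketch derives LC(50, t) for GUTS-RED-SD under constant exposure by root-finding on the survival function. The parameter values are illustrative, not taken from the paper:

```python
import numpy as np
from scipy.optimize import brentq

def survival_sd(conc, t, kd, b, z, hb, n=2000):
    """GUTS-RED-SD survival at time t under constant exposure `conc`.

    Scaled damage D(tau) = conc * (1 - exp(-kd * tau)); the hazard rate is
    background hb plus b * max(D - z, 0); survival = exp(-cumulative hazard).
    """
    tt = np.linspace(0.0, t, n)
    damage = conc * (1.0 - np.exp(-kd * tt))
    hazard = b * np.maximum(damage - z, 0.0) + hb
    # Cumulative hazard by the trapezoidal rule
    cum_hazard = np.sum(0.5 * (hazard[1:] + hazard[:-1]) * np.diff(tt))
    return np.exp(-cum_hazard)

def lc(x, t, kd, b, z, hb):
    """Constant concentration reducing survival by x% relative to control at time t."""
    target = (1.0 - x / 100.0) * survival_sd(0.0, t, kd, b, z, hb)
    return brentq(lambda c: survival_sd(c, t, kd, b, z, hb) - target, 1e-9, 1e6)

lc50_4d = lc(50, 4.0, kd=0.5, b=0.3, z=1.0, hb=0.01)
```

Because survival is available at every (conc, t) pair, the same routine yields LC(x, t) for any x and any time of interest; by construction, LC(x, t) decreases as the exposure duration t increases.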

