Ecotoxicology has long relied on traditional in vivo testing methods to assess the hazard potential of chemicals and to understand the possible risks that exposure could pose to ecological taxa. In the past decade, the development of non-animal new approach methods (NAMs) for assessing chemical hazard and risk has grown rapidly. These methods are often cheaper and faster than traditional toxicity testing, and thus are amenable to high-throughput toxicity testing (HTT), resulting in large datasets. The ToxCast/Tox21 HTT programs have produced in vitro data for thousands of chemicals covering a large space of biological activity. The relevance of these data to in vivo mammalian toxicity has been explored extensively. Interest has also grown in using these data to evaluate the risk of environmental exposures to taxa of ecological importance, such as fish and aquatic invertebrates, particularly for estimating the risk of exposure to real-world complex mixtures. Understanding the relationship and relative sensitivity of NAMs versus standardized ecotoxicological whole-organism models is a key component of performing reliable read-across from mammalian in vitro data to ecotoxicological in vivo data. In this work, we explore the relationship between in vivo ecotoxicity data from several publicly available databases and the ToxCast/Tox21 data. We also present several case studies comparing how the choice of ecotoxicity dataset, whether traditional or ToxCast-based, affects risk conclusions for exposure to complex mixtures derived from existing large-scale chemical monitoring data. Generally, the predictive value of ToxCast data for traditional in vivo endpoints was poor (r ≤ 0.3). Risk conclusions, including the chemical risk drivers identified and the monitoring sites prioritized, differed when using HTT data versus traditional in vivo data.
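The reported correlation (r ≤ 0.3) summarizes how well per-chemical in vitro potency tracks in vivo potency across a shared chemical set. A minimal sketch of such a comparison is shown below; the variable names and values are illustrative placeholders, not data from the study, and the study's actual endpoint matching and transformation choices may differ.

```python
# Sketch: correlating hypothetical in vitro HTT potency with in vivo
# ecotoxicity potency for a matched set of chemicals. All values are
# illustrative placeholders, not data from the study.
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-chemical potencies, log10-transformed (e.g., uM);
# lower values indicate greater potency.
log_invitro_ac50 = np.array([0.5, 1.2, -0.3, 2.0, 1.8, 0.1])  # e.g., lowest ToxCast AC50
log_invivo_ec50 = np.array([1.0, 2.5, 0.2, 1.1, 2.9, -0.5])   # e.g., fish acute EC50/LC50

r, p = pearsonr(log_invitro_ac50, log_invivo_ec50)
print(f"Pearson r = {r:.2f} (p = {p:.3f})")
```

A weak correlation at this step propagates into mixture risk estimates: per-chemical exposure-to-potency ratios summed across a monitored mixture will rank chemicals and sites differently depending on which potency dataset is used.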