Abstract

New approaches for toxicity testing: The US National Research Council report on ‘Toxicity Testing in the 21st Century’ (Krewski et al. 2010) envisioned a shift in testing away from studies of apical endpoints in test animals toward the use of human cells to assess perturbations of toxicity pathways (TPs). The report generated widespread interest and has prompted subsequent discussions regarding implementation of its key recommendations (Andersen and Krewski 2010; Krewski et al. 2011, 2014). TPs were defined as normal cellular signaling pathways that could serve as targets of toxicity when their function is perturbed by chemical exposures. The examples provided in the report included sex steroid hormone receptor pathways, liver nuclear receptor signaling, and the suite of eight canonical stress pathways: oxidative stress, DNA damage, heat shock, hypoxia, metal stress, inflammation, endoplasmic reticulum stress, and osmotic stress (Simmons et al. 2009). This aggregation of pathways, based largely on preexisting biological information, remains coarse-grained, with many possible nodes in each of these TPs whose alteration could lead to toxicity. Some of the continuing challenges in advancing new, cell-based methods for toxicity testing are (1) the manner in which testing will be accomplished, (2) the degree of detail required to define the biological targets whose alteration leads to toxicity, and (3) the biological granularity underpinning definitions of toxicity pathways.

Highlights

  • The path forward clearly requires the use of multiple approaches for identifying targets of toxicity, together with methods for querying how biological perturbations of these targets lead to toxic responses

  • In addition to examining chemicals or chemical libraries for which there is significant preexisting knowledge about the molecular initiating event (MIE), assays are needed that are agnostic with respect to pathway targets and provide a breadth of information from which either targets or safe exposures can be inferred

  • Advancing toxicity test information content through the development of PoTs: A continuing question relates to the level of detail to be included in defining PoTs and the bioinformatic and computational tools required to create both static and dynamic representations of PoTs. High-content data streams, such as gene expression microarrays, chromatin immunoprecipitation (ChIP), and metabolomics, are well-established tools in molecular biology. Outputs from these technologies need to be integrated to provide more than lists of features altered following treatment of cells with, or exposure of test animals to, particular chemicals


Introduction

The path forward clearly requires the use of multiple approaches for identifying targets of toxicity, together with methods for querying how biological perturbations of these targets lead to toxic responses. The high-throughput screening and case study approaches rely on preexisting knowledge to develop assays, design readouts, and propose interpretive tools for using the information in human safety assessments.

