Abstract

New-technology testing, such as gene-expression arrays and high-throughput cell-based assays, provides a new window for assessing the impact of chemical exposures, one that directly examines effects at the level of the underlying biochemical machinery that controls and modulates the living system. Because such assays enable the testing of many chemicals under different conditions at low cost, they promise to help address the difficulty that traditional animal testing has in keeping up with increasing regulatory demands for fuller and more comprehensive chemical characterization. Examining a large array of gene-expression changes simultaneously yields multivariate data that are useful for data mining and for statistical analysis of predictive profiles, even if the mechanistic role of each change is not well understood. In the future, however, the mechanistic interpretation of such data, as embodiments of biological control processes, their perturbation, and their possible failure, will become critical: these data will serve as primary observations from which potential apical toxicity can be deduced without resorting to in vivo animal testing. The vision of such application put forth in the 2007 National Academy of Sciences report, Toxicity Testing in the 21st Century, is discussed, along with what realization of that vision will mean for the revision of risk assessment approaches, which are tied to the information available from testing. Even short of attainment of this vision, however, new-technology data have useful applications as screening tools, as biomarkers, as diagnoses and characterizations of mode of action, in dose-response analysis, and as a means of characterizing interindividual variability. Possibilities, pitfalls, and impacts on risk assessment methods are described.
