Assessing toxicity is among the most ancient and important of human endeavors. Long before the taming of fire or the advent of tool use, humankind was confronted with a most basic problem of existence: what to ingest. Today, toxicology impacts every facet of our lives – from the food, supplements and therapeutics that we consume, to the consumer products that we utilize and, ultimately, the environment in which we live. Toxicity must, therefore, be considered not only as a problem for basic science, but in the context of safety and its accompanying regulatory imperatives.

In its most basic form, toxicity testing can be distilled down to two concepts: exposure and assessment. A model system – typically a laboratory animal – is exposed to a test article and then examined to determine whether any adverse effects have occurred. Stated differently, a biological system is perturbed and the resulting phenotype is characterized. Phenotype is conventionally defined as the visible properties of an organism resulting from gene–environment interactions. Detailed phenotypic characterization in the form of histopathology has dominated the field of toxicology and provided a strong basis for understanding specific mechanisms of toxicity and, more generally, the effects of chemical exposure and its relationship to disease. Importantly, the depth with which the effects of chemical exposure can now be described has been fundamentally enhanced through the introduction of comprehensive profiling technologies.

Toxicogenomics, in its most narrow sense, represents a synthesis of toxicology and genomics in an effort to understand mechanisms of toxicity through the use of genomic information, such as gene expression modulation captured using microarrays [1]. An advantage of this approach is that a large number of attributes – in this case, transcripts – can be used as descriptors of the biological system under scrutiny. In principle, enlarging the descriptor space enhances the potential for accurately differentiating between closely related systems and the outcomes induced in them. Thus, a profile, or signature, of gene expression can be used to classify mechanisms of toxicity [2], as sketched below.

But genomics represents only a piece of the puzzle, and alternative data streams, such as proteomics [3] and metabolomics [4], can also be used to provide high-content system descriptors. Proponents of each technique can point to the technological advantages of their chosen method and argue for the comparative relevance of its data type for understanding mechanisms of toxicity. Clearly, no single method can provide a truly comprehensive view of the state of a biological system, nor can any single method adequately describe the interplay between genes, proteins and metabolites in a way that reflects the dynamics of the system. The goal of using molecular profiling techniques in the context of toxicology should be to move the science beyond the mere classification of known compounds and test articles and into the realm of real understanding and knowledge about the specific mechanisms whereby chemical and environmental agents effect adverse outcomes. For this, we must capture data from multiple sources and provide a holistic framework for their integration.
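To make the signature idea concrete, the following is a minimal sketch, in Python with scikit-learn, of how expression profiles, here combined with a second simulated data stream, might be used to classify mechanisms of toxicity. Everything in it (sample counts, feature dimensions, the three mechanism classes and the planted signatures) is hypothetical and illustrative, not a published toxicogenomic workflow.

    # Minimal, illustrative sketch: classifying a simulated "mechanism of
    # toxicity" label from high-content profiles. All data are random draws
    # with a weak planted signature; nothing here comes from a real study.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n = 60                                   # hypothetical number of treated samples
    y = np.repeat([0, 1, 2], n // 3)         # three hypothetical mechanism classes
    transcripts = rng.normal(size=(n, 500))  # simulated expression profiles
    metabolites = rng.normal(size=(n, 80))   # simulated metabolite profiles

    # Plant a weak class-specific signature in each data stream so the toy
    # classifier has something to recover.
    for cls in range(3):
        transcripts[y == cls, cls * 50:(cls + 1) * 50] += 1.0
        metabolites[y == cls, cls * 10:(cls + 1) * 10] += 1.0

    # The crudest form of integration: scale each stream and concatenate the
    # two descriptor spaces into one feature vector per sample.
    X = np.hstack([StandardScaler().fit_transform(transcripts),
                   StandardScaler().fit_transform(metabolites)])

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    print("cross-validated accuracy: %.2f" % cross_val_score(clf, X, y, cv=5).mean())

Note that simple concatenation of descriptor spaces is the crudest possible form of integration: it captures none of the interdependencies between the data streams, which is precisely the gap that the systems approaches discussed below aim to address.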
Toxicogenomics must, therefore, expand beyond the narrow scope of its own definition to encompass multiple, high-content data streams in a manner that accounts for the interdependencies inherent between the elements that comprise a system. In short, we must advance toxicology toward systems biology. Systems biology has been described in many ways, but the essential elements of the original definition remain the most relevant: systematically perturbing a biological system, monitoring the behavior of the elements that make up a system (e.g., genes, proteins, and metabolites), integrating these data, and describing the structure of their interrelationships using mathematical models in an effort to account for emergent properties of the system [5]. The perceived promise of a systems approach to biological problems has led to a great deal of interest and activity in the pharmaceutical, biotechnology and educational sectors. In particular, enhancing our understanding of safety and efficacy issues as a prelude to personalized medicine has galvanized previously disparate efforts. Significant progress has been made in developing a suite of tools for perturbing biological systems by changing their