The optimal use of radiation therapy for cancer treatment is hampered by the application of normal-tissue tolerance limits derived empirically from population averages. Such limits do not reflect the considerable patient-to-patient differences in susceptibility to late radiation sequelae. Assays that accurately predict normal tissue tolerance in individual patients would permit genuine application of the concept of treatment to tolerance, thereby increasing the probability of an uncomplicated cure for the population as a whole. A summary of laboratory research is presented to test the hypothesis that the cellular radiosensitivity of normal skin fibroblasts can predict the severity of late connective tissue damage that develops following radiotherapy. The pathogenesis of radiation reactions and the possible role of radiation-induced cellular senescence in the development of clinical late effects are briefly reviewed. Although the pathogenesis of radiation injury is highly complex, several clinical studies have demonstrated a significant correlation between fibroblast radiosensitivity and the severity of late sequelae from treatment. However, the precision and reproducibility of fibroblast cell survival assays are inadequate for routine clinical use. Newer assays incorporating insights into the effects of radiation on cellular senescence and cytokine production are being developed. Such assays may, in the future, be complemented or replaced by molecular and/or cytogenetic probes to derive robust estimates of individual tolerance. The principle of predicting tolerance to radiotherapy has been established. Although current assays lack the precision required for clinical use, the goal of individualized treatment to tolerance ultimately should be achievable.