Abstract

The therapeutic use of ionizing radiations is predicated on sparing normal tissues while attempting to achieve lethal effects on tumor cells. From quite early in the history of radiation therapy, it was apparent that there were striking differences in effects in the panoply of normal tissues. Although there was early appreciation of some late effects in normal tissues, often not predicted by acute reactions, only in recent years has there been full documentation of the slow and progressive increase in severity of late damage. Pathophysiological mechanisms of acute and late radiation effects are better understood today (2), but interactions of other modalities with radiation therapy require constant monitoring to recognize and mitigate untoward sequelae. The work of Stone (3) is a classic example of unanticipated late effects, which resulted from irradiation with fast neutrons. Acute reactions were moderate and tolerable, but the late sequelae were so marked that there was little interest in pursuing therapy with fast neutrons for nearly three decades. The Late Morbidity Scoring Criteria were developed as a joint effort between physicians with renewed interest in fast neutron therapy and Radiation Therapy Oncology Group (RTOG) staff. In the late 1970s, the Neutron/Particle Committee was one of several modality committees of the RTOG. Recognizing the results of Stone, this committee, led by Lawrence Davis, worked with RTOG staff to establish criteria and scoring for possible late effects from fast neutron radiation therapy. Investigators from the European Organization for Research and Treatment of Cancer (EORTC), led by William Duncan of the Western General Hospital of Edinburgh, wished to have common toxicity criteria in anticipation of joint studies. RTOG Protocol 7929, an international registry of patients treated with heavy particles, was started in 1980.
At the annual meetings of the international participants in particle studies, there were attempts to monitor interobserver variations in scoring effects in normal tissues and to seek consistency in reporting toxicity, but no publications document these efforts. The first prospective trial to use the Late Morbidity Scoring Criteria was RTOG Protocol 8001, a study of fast neutron therapy for malignant tumors arising in salivary glands. Although the RTOG began to use these criteria in reporting toxicity in patients enrolled in all studies from 1981 (beginning with RTOG Protocol 8115), the criteria only became a published part of protocols in 1983. At that time, statistical methods began to be used that presented time-adjusted estimates of late effects, the rationale for which was described by Cox (1). It is now considered standard to represent cumulative probabilities of late effects with methods similar to those for estimating local control and survival. The Acute Radiation Morbidity Scoring Criteria were developed in 1985 as complementary to the Late Effects Scoring Criteria. The National Cancer Institute promulgated standard toxicity criteria in 1990, but late effects were not considered. An abbreviated version of the RTOG/EORTC toxicity criteria was published by Winchester and Cox in 1992 as part of the Standards for Breast Conservation Treatment. The current RTOG Acute Radiation Morbidity Scoring Criteria are presented in Table 1. The RTOG/EORTC Late Radiation Morbidity Scoring Scheme is detailed in Table 2. In both tables, 0 means an absence of radiation effects and 5 means the effects led to death. The severity