Abstract

Carbon ions are an emerging ion species currently used in charged particle radiotherapy. As it is well established that considerable interindividual differences in radiosensitivity exist in the general population and can significantly influence clinical outcomes of radiotherapy, we evaluated the degree of these differences in the context of carbon ion therapy compared with conventional radiotherapy. In this study, we assessed individual radiosensitivity following exposure to carbon-13 ions or γ-rays in peripheral blood lymphocytes of healthy individuals, based on the frequency of ionizing radiation (IR)-induced DNA double strand breaks (DSBs) that were either misrepaired or left unrepaired to form chromosomal aberrations (CAs) (referred to here simply as DSBs for brevity). Levels of DSBs were estimated by scoring CAs visualized with telomere/centromere-fluorescence in situ hybridization (TC-FISH). We examined radiosensitivity at 2 Gy, a dose routinely administered during fractionated radiotherapy, and determined that a given dose induced a wide range of DSB frequencies among healthy individuals, with highly radiosensitive individuals harboring more IR-induced breaks in the genome than radioresistant individuals following exposure to the same dose. Furthermore, we determined the relative effectiveness of carbon irradiation compared with γ-irradiation in inducing DSBs at each studied dose (isodose effect), a quantity we term the “relative dose effect” (RDE). This ratio is advantageous because it allows simple comparison of dose–response curves. At 2 Gy, carbon irradiation was three times more effective than γ-irradiation in inducing DSBs (RDE of 3); these results were confirmed using a second cytogenetic technique, multicolor-FISH. We also analyzed radiosensitivity at other doses (0.2–15 Gy), representative of hypo- and hyperfractionation, and found that the RDE is dose dependent: high at low doses and approaching 1 at high doses. These results could have clinical implications, as IR-induced DNA damage and the ensuing CAs and genomic instability can have significant cellular consequences, with potentially profound effects on long-term human health after IR exposure, such as the emergence of secondary cancers and other pathobiological conditions after radiotherapy.
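For concreteness, the RDE described above can be written as a ratio of aberration yields measured at a common dose; the notation below (Y for the yield of CA-forming DSBs, D for dose) is our shorthand and not taken from the source:

\[
\mathrm{RDE}(D) \;=\; \frac{Y_{\text{carbon}}(D)}{Y_{\gamma}(D)}
\]

On this reading, the reported value of RDE ≈ 3 at 2 Gy means that carbon ions induced roughly three times as many CA-forming DSBs as γ-rays at that dose, with the ratio shrinking toward 1 as the dose increases.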

Highlights

  • Current radiotherapy regimens use photons or protons for the treatment of a plethora of malignancies

  • We demonstrate that, following in vitro irradiation with carbon ions or γ-rays at 2 Gy, a dose routinely administered during fractionated radiotherapy [10, 11], interindividual differences in radiosensitivity exist among healthy individuals

  • A given dose of ionizing radiation (IR) can induce a wide range of DNA damage among healthy individuals, with highly radiosensitive individuals harboring more IR-induced damage in the genome than radioresistant individuals following exposure to the same IR dose

Introduction

Current radiotherapy regimens use photons or protons for the treatment of a plethora of malignancies. In contrast, high linear energy transfer (LET) IR, such as heavy ions, is characterized by a relatively low entrance dose in the target material, followed by a pronounced, sharp dose maximum near the end of its range, called the Bragg peak, with the deposited dose falling close to zero beyond the Bragg peak. This characteristic of high LET IR is especially useful for the treatment of deep-seated tumors in the human body, as it allows a great amount of energy to be precisely localized at the tumor site when the tumor is placed at the Bragg peak, while minimally exposing the surrounding normal tissues [2]. Further investigation is necessary to characterize and understand how carbon ion therapy works in comparison to conventional radiotherapy.
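The depth-dose contrast described above can be sketched numerically. The toy Python snippet below is purely illustrative: the attenuation coefficient, ion range, and curve shapes are placeholder assumptions, not beam data from the source.

```python
# Toy, purely schematic depth-dose sketch (not a physical beam model) to
# illustrate the qualitative contrast described above: photons deposit most
# of their dose near the entrance and attenuate with depth, whereas carbon
# ions show a low entrance dose, a sharp maximum (the Bragg peak) near the
# end of their range, and essentially no dose beyond it.
import numpy as np

depth_cm = np.linspace(0.0, 20.0, 201)   # depth in tissue (cm); illustrative span
ion_range_cm = 15.0                      # assumed ion range; placeholder value

# Photons: rough exponential attenuation (toy coefficient, not tissue data).
photon_dose = np.exp(-0.06 * depth_cm)

# Carbon ions: modest entrance plateau plus a narrow peak at the end of the
# range, dropping to ~0 beyond it (a caricature, not a transport calculation).
ion_dose = np.where(
    depth_cm <= ion_range_cm,
    0.3 + 0.7 * np.exp(-((depth_cm - ion_range_cm) ** 2) / 0.5),
    0.0,
)

for d in (0.0, 7.5, 15.0, 17.0):
    i = int(np.argmin(np.abs(depth_cm - d)))
    print(f"depth {d:5.1f} cm | photon {photon_dose[i]:.2f} | carbon ion {ion_dose[i]:.2f}")
```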
