Abstract

We reviewed the available evidence in the medical literature concerning experimental models of exposure to ionizing radiation (IR) and the mechanisms by which it damages living organisms. The traditional model is based on the theory of “stochastic breakage” of one or both strands of the DNA double helix. According to this model, high doses may cause breaks that are potentially lethal to the cell by damaging both DNA strands, while low doses of IR would cause essentially single-strand breaks, which are easily repaired and leave no permanent damage. The available evidence makes this classical model increasingly less acceptable, because exposure to low doses of IR appears to have carcinogenic effects, even after years or decades, both in the exposed individuals and in subsequent generations. In addition, cells that survive exposure to low doses, despite appearing normal, accumulate damage that becomes evident in their progeny, such as nonclonal chromosomal aberrations, which can be found even in cells not directly irradiated, owing to the exchange of molecular signals and complex tissue reactions involving neighboring or distant cells. For all these reasons, a paradigm shift is needed, one based on the evidence and on epigenetics.

Highlights

  • The dangers of ionizing radiation (IR) to human health have been known since the last century. There is general agreement that high doses of IR represent a major threat to human health

  • The models of exposure, risk assessment, and damage adopted in environmental health (including IR) are inevitably shaped by the way in which history determined and conditioned the research. For this reason, to better understand the necessity of a paradigm shift, we need to start from a brief historical assessment of radiobiology, a discipline dominated by physicists who for decades described the interactions between radiation and living matter mainly in terms of energy transfers and DNA damage

  • Data concerning a large population of over 2 million children exposed in utero after Chernobyl, in Belarus and Greece, where the exposure was more substantial [49], and in Scotland, Wales, and Germany, where the exposure was much lower [50,51], have recently led some researchers to calculate an increase of over 40% in cases of leukemia among children born during the period of the maximum peak of cesium in food, compared with children born before the accident or during the two following years (31 December 1985 to 1 January 1988) [52]


Summary

Introduction

The dangers of ionizing radiation (IR) to human health have been known since the last century. The limited total count of harmful effects in populations exposed to high total doses of IR, together with the supposed absence of major differences between more and less exposed subjects, led the majority of researchers and international agencies to underestimate for decades the risks of radiation, especially the effects of prolonged exposure to low doses, which are the most frequent and the most dangerous for human beings. This serious error of judgment had its roots in the way in which the studies were conducted: based on the linear no-threshold (LNT) model and on total absorbed doses estimated from the distance from the epicenter of the explosion, through extremely complex calculations that have been revised several times [13].
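To make the contested assumption explicit (a minimal sketch, not spelled out in the source): the LNT model posits that excess cancer risk grows linearly with absorbed dose, with no threshold below which the risk vanishes,

R(D) = R_0 + \alpha D

where D is the absorbed dose, R_0 the baseline risk, and \alpha a risk coefficient estimated mainly from high-dose data such as the atomic-bomb survivor cohorts. Extrapolating this straight line down to low doses is precisely the step that the evidence reviewed here calls into question.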

Increasing Evidence Fostering a Paradigm Change
The Chernobyl Lessons
Genomic Instability
Findings
Conclusions
