Radiation hormesis is the hypothesis that low doses of ionizing radiation (at or just above natural background levels) are beneficial, stimulating repair mechanisms that protect against disease and that are not activated in the absence of ionizing radiation. These reserve repair mechanisms are hypothesized to be sufficiently effective when stimulated not only to cancel the detrimental effects of ionizing radiation but also to inhibit disease not related to radiation exposure (see hormesis). [1] [2] [3] [4] It has been a mainstream concept since at least 2009. [5] [ unreliable source? ]
While the effects of high and acute doses of ionizing radiation on humans are readily observed and understood (e.g. in Japanese atomic bomb survivors), the effects of low-level radiation are very difficult to observe and highly controversial. This is because the baseline cancer rate is already very high and the risk of developing cancer fluctuates by ~40% because of individual lifestyle and environmental effects, [6] [7] obscuring the subtle effects of low-level radiation. An acute effective dose of 100 millisieverts may increase cancer risk by ~0.8%. However, children are particularly sensitive to radioactivity, with childhood leukemias and other cancers increasing even within natural and man-made background radiation levels (under 4 mSv cumulative, with 1 mSv being an average annual dose from terrestrial and cosmic radiation, excluding radon, which primarily doses the lung). [8] [9] There is limited evidence that exposures around this dose level cause negative subclinical health impacts on neural development. [10] Students born in regions of higher Chernobyl fallout performed worse in secondary school, particularly in mathematics. "Damage is accentuated within families (i.e., siblings comparison) and among children born to parents with low education..." who often do not have the resources to overcome this additional health challenge. [11]
Hormesis remains largely unknown to the public. Government and regulatory bodies disagree on the existence of radiation hormesis and research points to the "severe problems and limitations" with the use of hormesis in general as the "principal dose-response default assumption in a risk assessment process charged with ensuring public health protection." [12]
Quoting results from a literature database research, the Académie des Sciences – Académie nationale de Médecine (French Academy of Sciences – National Academy of Medicine) stated in their 2005 report concerning the effects of low-level radiation that many laboratory studies have observed radiation hormesis. [13] [14] However, they cautioned that it is not yet known if radiation hormesis occurs outside the laboratory, or in humans. [15]
Reports by the United States National Research Council, the National Council on Radiation Protection and Measurements, and the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) argue [16] that there is no evidence for hormesis in humans; in the case of the National Research Council, hormesis is outright rejected as a possibility. [17] The linear no-threshold model (LNT) therefore continues to be the model generally used by regulatory agencies for human radiation exposure.
Radiation hormesis proposes that radiation exposure comparable to and just above the natural background level of radiation is not harmful but beneficial, while accepting that much higher levels of radiation are hazardous. Proponents of radiation hormesis typically claim that radio-protective responses in cells and the immune system not only counter the harmful effects of radiation but additionally act to inhibit spontaneous cancer not related to radiation exposure. Radiation hormesis stands in stark contrast to the more generally accepted linear no-threshold model (LNT), which states that the radiation dose-risk relationship is linear across all doses, so that small doses are still damaging, albeit less so than higher ones. Opinion pieces on chemical and radiobiological hormesis appeared in the journals Nature [1] and Science [3] in 2003.
Assessing the risk of radiation at low doses (<100 mSv) and low dose rates (<0.1 mSv/min) is highly problematic and controversial. [18] [19] While epidemiological studies on populations exposed to an acute dose of high-level radiation, such as Japanese atomic bomb survivors (hibakusha (被爆者)), have robustly upheld the LNT (mean dose ~210 mSv), [20] studies involving low doses and low dose rates have failed to detect any increased cancer rate. [19] This is because the baseline cancer rate is already very high (~42 of 100 people will be diagnosed with cancer in their lifetime) and it fluctuates by ~40% because of lifestyle and environmental effects, [7] [21] obscuring the subtle effects of low-level radiation. Epidemiological studies may be capable of detecting elevated relative risks as low as 1.2 to 1.3, i.e. a 20% to 30% increase. But for low doses (1–100 mSv) the predicted relative risks are only 1.001 to 1.04, and excess cancer cases, if present, cannot be detected due to confounding factors, errors and biases. [21] [22] [23]
In particular, variations in smoking prevalence, or even in the accuracy of reporting smoking, cause wide variation in excess cancer estimates and measurement-error bias. Thus, even a large study of many thousands of subjects with imperfect smoking-prevalence information may be less able to detect the effects of low-level radiation than a smaller study that properly compensates for smoking prevalence. [24] Given the absence of direct epidemiological evidence, there is considerable debate as to whether the dose-response relationship below 100 mSv is supralinear, linear (LNT), has a threshold, is sub-linear, or whether the coefficient changes sign and becomes negative, i.e. a hormetic response.
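The detectability problem can be illustrated with a back-of-the-envelope sketch (a hand calculation restating the figures quoted above, not an analysis from the cited studies):

```python
# Comparing the excess lifetime cancer risk predicted at low doses with
# the baseline variation that epidemiology has to see through.
baseline = 0.42       # ~42 of 100 people diagnosed with cancer in a lifetime
fluctuation = 0.40    # ~40% variation from lifestyle and environment

for rr in (1.001, 1.04, 1.2, 1.3):
    excess = baseline * (rr - 1)    # extra lifetime cases per person
    print(f"relative risk {rr}: excess {excess:.4f} "
          f"vs baseline spread ~{baseline * fluctuation:.3f}")
```

Even at the upper end of the predicted low-dose range (relative risk 1.04), the excess is roughly a tenth of the spread attributable to confounders, which is why such studies struggle to resolve a signal.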
The radiation adaptive response appears to be a principal origin of any potential hormetic effect. Theoretical studies indicate that the adaptive response is responsible for the shape of the dose-response curve and can transform the linear (LNT) relationship into a hormetic one. [25] [26]
While most major consensus reports and government bodies currently adhere to LNT, [27] the 2005 French Academy of Sciences-National Academy of Medicine's report concerning the effects of low-level radiation rejected LNT as a scientific model of carcinogenic risk at low doses. [15]
Using LNT to estimate the carcinogenic effect at doses of less than 20 mSv is not justified in the light of current radiobiologic knowledge.
They consider there to be several dose-effect relationships rather than only one, and that these relationships depend on many variables such as target tissue, radiation dose, dose rate and individual sensitivity factors. They state that further study is required on low doses (less than 100 mSv) and very low doses (less than 10 mSv), as well as on the impact of tissue type and age. The Academy considers the LNT model useful only for regulatory purposes, as it simplifies the administrative task. Quoting results from literature research, [13] [14] they furthermore claim that approximately 40% of laboratory studies on cell cultures and animals indicate some degree of chemical or radiobiological hormesis, and state:
...its existence in the laboratory is beyond question and its mechanism of action appears well understood.
They go on to outline a growing body of research illustrating that the human body is not a passive accumulator of radiation damage but actively repairs the damage caused, via a number of different processes. [15] [19]
Furthermore, increased sensitivity to radiation-induced cancer in the inherited condition ataxia-telangiectasia-like disorder illustrates the damaging effects of loss of the repair gene Mre11h, which results in the inability to fix DNA double-strand breaks. [28]
The BEIR VII report argued that "the presence of a true dose threshold demands totally error-free DNA damage response and repair." The specific damage of concern is double-strand breaks (DSBs), and the report continues: "error-prone nonhomologous end joining (NHEJ) repair in postirradiation cellular response, argues strongly against a DNA repair-mediated low-dose threshold for cancer initiation". [29] Recent research has observed that DSBs caused by CT scans are repaired within 24 hours and that DSBs may be repaired more efficiently at low doses, suggesting that the risk of ionizing radiation at low doses may not be directly proportional to the dose. [30] [31] However, it is not known whether low-dose ionizing radiation stimulates the repair of DSBs not caused by ionizing radiation, i.e. a hormetic response.
Radon gas in homes is the largest source of radiation dose for most individuals, and it is generally advised that the concentration be kept below 150 Bq/m³ (4 pCi/L). [32] A recent retrospective case-control study of lung cancer risk showed a substantial reduction in cancer rates at 50 to 123 Bq/m³ relative to a group at 0 to 25 Bq/m³. [33] This study is cited as evidence for hormesis, but a single study by itself cannot be regarded as definitive. Other studies into the effects of domestic radon exposure have not reported a hormetic effect, including for example the respected "Iowa Radon Lung Cancer Study" of Field et al. (2000), which also used sophisticated radon exposure dosimetry. [34] In addition, Darby et al. (2005) argue that radon exposure is negatively correlated with the tendency to smoke, so environmental studies need to control for this accurately; people living in urban areas, where smoking rates are higher, usually have lower levels of radon exposure due to the increased prevalence of multi-story dwellings. [35] When doing so, they found a significant increase in lung cancer among smokers exposed to radon at concentrations as low as 100 to 199 Bq/m³ and warned that smoking greatly increases the risk posed by radon exposure, i.e. reducing the prevalence of smoking would decrease deaths caused by radon. [35] [36] However, debate over the conflicting experimental results continues; [37] in particular, well-known US and German studies have reported some hormetic effects. [38] [39]
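The two radon concentration units quoted above are related by 1 pCi/L = 37 Bq/m³; a minimal conversion sketch (the function name is ours, for illustration only):

```python
# 1 Ci = 3.7e10 Bq, so 1 pCi = 0.037 Bq; going from "per litre" to
# "per cubic metre" multiplies by 1000, giving 1 pCi/L = 37 Bq/m^3.
BQ_PER_M3_PER_PCI_PER_L = 37.0

def bq_m3_to_pci_l(concentration_bq_m3):
    """Convert a radon concentration from Bq/m^3 to pCi/L."""
    return concentration_bq_m3 / BQ_PER_M3_PER_PCI_PER_L

print(f"{bq_m3_to_pci_l(150):.2f} pCi/L")   # ~4.05, the ~4 pCi/L guideline above
```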
Furthermore, particle microbeam studies show that passage of even a single alpha particle (e.g. from radon and its progeny) through cell nuclei is highly mutagenic, [40] and that alpha radiation may have a higher mutagenic effect at low doses (even if a small fraction of cells are hit by alpha particles) than predicted by linear no-threshold model, a phenomenon attributed to bystander effect. [41] However, there is currently insufficient evidence at hand to suggest that the bystander effect promotes carcinogenesis in humans at low doses. [42]
Radiation hormesis has not been accepted by either the United States National Research Council, [17] or the National Council on Radiation Protection and Measurements (NCRP). [43] In May 2018, the NCRP published the report of an interdisciplinary group of radiation experts who critically reviewed 29 high-quality epidemiologic studies of populations exposed to radiation in the low dose and low dose-rate range, mostly published within the last 10 years. [44] The group of experts concluded:
The recent epidemiologic studies support the continued use of the LNT model for radiation protection. This is in accord with judgments by other national and international scientific committees, based on somewhat older data, that no alternative dose-response relationship appears more pragmatic or prudent for radiation protection purposes than the LNT model.
In addition, the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR) wrote in its 2000 report: [45]
Until the [...] uncertainties on low-dose response are resolved, the Committee believes that an increase in the risk of tumour induction proportionate to the radiation dose is consistent with developing knowledge and that it remains, accordingly, the most scientifically defensible approximation of low-dose response. However, a strictly linear dose response should not be expected in all circumstances.
This is a reference to the fact that very low doses of radiation have only marginal impacts on individual health outcomes. It is therefore difficult to detect the 'signal' of decreased or increased morbidity and mortality due to low-level radiation exposure in the 'noise' of other effects. The notion of radiation hormesis has been rejected by the National Research Council's (part of the National Academy of Sciences) 16-year-long study on the Biological Effects of Ionizing Radiation. "The scientific research base shows that there is no threshold of exposure below which low levels of ionizing radiation can be demonstrated to be harmless or beneficial. The health risks – particularly the development of solid cancers in organs – rise proportionally with exposure" says Richard R. Monson, associate dean for professional education and professor of epidemiology, Harvard School of Public Health, Boston. [46] [17]
The possibility that low doses of radiation may have beneficial effects (a phenomenon often referred to as "hormesis") has been the subject of considerable debate. Evidence for hormetic effects was reviewed, with emphasis on material published since the 1990 BEIR V study on the health effects of exposure to low levels of ionizing radiation. Although examples of apparent stimulatory or protective effects can be found in cellular and animal biology, the preponderance of available experimental information does not support the contention that low levels of ionizing radiation have a beneficial effect. The mechanism of any such possible effect remains obscure. At this time, the assumption that any stimulatory hormetic effects from low doses of ionizing radiation will have a significant health benefit to humans that exceeds potential detrimental effects from radiation exposure at the same dose is unwarranted.
Kerala's monazite sand (containing a third of the world's economically recoverable reserves of radioactive thorium) emits about 8 microsieverts per hour of gamma radiation, 80 times the dose rate equivalent in London, but a decade-long study of 69,985 residents published in Health Physics in 2009 "showed no excess cancer risk from exposure to terrestrial gamma radiation. The excess relative risk of cancer excluding leukemia was estimated to be −0.13 per Gy (95% CI: −0.58, 0.46)", indicating no statistically significant positive or negative relationship between background radiation levels and cancer risk in this sample. [47]
Studies in cell cultures can be useful for finding mechanisms for biological processes, but they also can be criticized for not effectively capturing the whole of the living organism.
A study by E. I. Azzam suggested that pre-exposure to radiation causes cells to turn on protection mechanisms. [48] A different study by de Toledo and collaborators has shown that irradiation with gamma rays increases the concentration of glutathione, an antioxidant found in cells. [49]
In 2011, an in vitro study led by S. V. Costes showed, in time-lapse images, a strongly non-linear response of certain cellular repair mechanisms called radiation-induced foci (RIF). The study found that low doses of radiation prompted higher rates of RIF formation than high doses, and that after low-dose exposure RIF continued to form after the irradiation had ended. Measured rates of RIF formation were 15 RIF/Gy at 2 Gy and 64 RIF/Gy at 0.1 Gy. [31] These results suggest that low doses of ionizing radiation may not increase cancer risk in direct proportion to dose, and thus contradict the standard linear no-threshold model. [50] Mina Bissell, a world-renowned breast-cancer researcher and collaborator in this study, stated: "Our data show that at lower doses of ionizing radiation, DNA repair mechanisms work much better than at higher doses. This non-linear DNA damage response casts doubt on the general assumption that any amount of ionizing radiation is harmful and additive." [50]
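The quoted RIF rates imply the following simple arithmetic (our restatement of the figures above, not additional data from the study):

```python
# Measured RIF induction rates from the study quoted above.
rif_per_gy = {2.0: 15.0, 0.1: 64.0}   # dose in Gy -> RIF per Gy

for dose, rate in rif_per_gy.items():
    print(f"{dose} Gy at {rate} RIF/Gy -> {dose * rate:.1f} RIF in total")

# A strictly linear (LNT-like) response would give the same RIF/Gy at
# both doses; the roughly fourfold higher per-Gy rate at 0.1 Gy is the
# non-linearity the study reports.
```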
An early study on mice exposed daily to a low dose of radiation (0.11 R per day) suggested that they may outlive control animals. [51] A study by Otsuka and collaborators found hormesis in animals. [52] Miyachi conducted a study on mice and found that a 200 mGy X-ray dose protects mice against both further X-ray exposure and ozone gas. [53] In another rodent study, Sakai and collaborators found that gamma irradiation at 1 mGy/h prevents the development of cancer induced by chemical means (injection of methylcholanthrene). [54]
In a 2006 paper, [55] a dose of 1 Gy was delivered to cells (at a constant rate from a radioactive source) over a series of time periods between 8.77 and 87.7 hours. The abstract states that for a dose delivered over 35 hours or more (low dose rate), no transformation of the cells occurred. For the 1 Gy dose delivered over 8.77 to 18.3 hours, the biological effect (neoplastic transformation) was about "1.5 times less than that measured at high dose rate in previous studies with a similar quality of [X-ray] radiation". Likewise, it has been reported that fractionation of gamma irradiation reduces the likelihood of a neoplastic transformation. [56] Pre-exposure to fast neutrons and gamma rays from Cs-137, by contrast, is reported to increase the ability of a second dose to induce a neoplastic transformation. [57]
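The exposure times in that paper correspond to the following average dose rates for the fixed 1 Gy total dose (a unit conversion only, not data from the paper):

```python
# Average dose rate = total dose / exposure time, for the 1 Gy dose above.
total_dose_mgy = 1000.0   # 1 Gy expressed in mGy

for hours in (8.77, 18.3, 35.0, 87.7):
    print(f"1 Gy over {hours} h = {total_dose_mgy / hours:.1f} mGy/h")
```

So the "low dose rate" regime in which no transformation occurred corresponds to roughly 29 mGy/h and below.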
Caution must be used in interpreting these results because, as noted in the BEIR VII report, such pre-doses can also increase cancer risk: [17]
In chronic low-dose experiments with dogs (75 mGy/d for the duration of life), vital hematopoietic progenitors showed increased radioresistance along with renewed proliferative capacity (Seed and Kaspar 1992). Under the same conditions, a subset of animals showed an increased repair capacity as judged by the unscheduled DNA synthesis assay (Seed and Meyers 1993). Although one might interpret these observations as an adaptive effect at the cellular level, the exposed animal population experienced a high incidence of myeloid leukemia and related myeloproliferative disorders. The authors concluded that "the acquisition of radioresistance and associated repair functions under the strong selective and mutagenic pressure of chronic radiation is tied temporally and causally to leukemogenic transformation by the radiation exposure" (Seed and Kaspar 1992).
However, 75 mGy/d cannot be accurately described as a low dose rate: it is equivalent to over 27 sieverts per year. The same study on dogs showed no increase in cancer and no reduction in life expectancy for dogs irradiated at 3 mGy/d. [58]
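The per-year figure can be checked directly (for gamma radiation the weighting factor is 1, so absorbed grays and equivalent sieverts coincide here):

```python
# Annualising the chronic dose rates from the dog study quoted above.
def gy_per_year(mgy_per_day):
    """Convert a daily dose in mGy/day to an annual dose in Gy/year."""
    return mgy_per_day * 365.25 / 1000.0

print(f"{gy_per_year(75.0):.1f} Gy/year")   # ~27.4, i.e. >27 Sv/year for gamma
print(f"{gy_per_year(3.0):.2f} Gy/year")    # ~1.10 for the 3 mGy/d group
```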
A long-term study of Chernobyl disaster liquidators [59] found that "During current research paradoxically longer telomeres were found among persons, who have received heavier long-term irradiation." and that "Mortality due to oncologic diseases was lower than in general population in all age groups that may reflect efficient health care of this group." In its conclusion, however, the study set these interim results aside and followed the LNT hypothesis: "The signs of premature aging were found in Chernobyl disaster clean-up workers; moreover, aging process developed in heavier form and at younger age in humans, who underwent greater exposure to ionizing radiation."
A study of survivors of the Hiroshima atomic bomb explosion yielded similar results. [60]
In an Australian study which analyzed the association between solar UV exposure and DNA damage, the results indicated that although the frequency of cells with chromosome breakage increased with increasing sun exposure, the misrepair of DNA strand breaks decreased as sun exposure was heightened. [61]
The health of the inhabitants of radioactive apartment buildings in Taiwan has received prominent attention. In 1982, more than 20,000 tons of steel were accidentally contaminated with cobalt-60, and much of this radioactive steel was used to build apartments, exposing thousands of Taiwanese to gamma radiation levels of up to more than 1,000 times background (average 47.7 mSv, maximum 2,360 mSv excess cumulative dose). The radioactive contamination was discovered in 1992.
A seriously flawed 2004 study compared the building's younger residents with the much older general population of Taiwan and determined that the younger residents were less likely to have been diagnosed with cancer than older people; this was touted as evidence of a radiation hormesis effect. [62] [63] (Older people have much higher cancer rates even in the absence of excess radiation exposure.)
In the years shortly after exposure, the total number of cancer cases was reported to be either lower than the society-wide average or slightly elevated. [64] [65] Leukaemia and thyroid cancer were substantially elevated. [62] [64] When a lower rate of "all cancers" was found, it was attributed to the exposed residents having a higher socioeconomic status, and thus an overall healthier lifestyle. [62] [64] Additionally, Hwang et al. cautioned in 2006 that leukaemia was the first cancer type found to be elevated among the survivors of the Hiroshima and Nagasaki bombings, so it could be decades before any increase in more common cancer types is seen. [62]
Besides the excess risks of leukaemia and thyroid cancer, a later publication notes various DNA anomalies and other health effects among the exposed population: [66]
There have been several reports concerning the radiation effects on the exposed population, including cytogenetic analysis that showed increased micronucleus frequencies in peripheral lymphocytes in the exposed population, increases in acentromeric and single or multiple centromeric cytogenetic damages, and higher frequencies of chromosomal translocations, rings and dicentrics. Other analyses have shown persistent depression of peripheral leucocytes and neutrophils, increased eosinophils, altered distributions of lymphocyte subpopulations, increased frequencies of lens opacities, delays in physical development among exposed children, increased risk of thyroid abnormalities, and late consequences in hematopoietic adaptation in children.
People living in these buildings also experienced infertility. [67]
Intentional exposure to water and air containing increased amounts of radon is perceived as therapeutic, and "radon spas" can be found in the United States, Czechia, Poland, Germany, Austria and other countries.
Given the uncertain effects of low-level and very-low-level radiation, there is a pressing need for quality research in this area. An expert panel convened at the 2006 Ultra-Low-Level Radiation Effects Summit in Carlsbad, New Mexico, proposed the construction of an ultra-low-level radiation laboratory. [68] The laboratory, if built, would investigate the effects of almost no radiation on laboratory animals and cell cultures, comparing these groups with control groups exposed to natural radiation levels. Precautions would be taken, for example, to remove potassium-40 from the food of laboratory animals. The expert panel believes that such a laboratory is the only experiment that can explore with authority and confidence the effects of low-level radiation, and that it could confirm or discard the various radiobiological effects proposed at low radiation levels, e.g. LNT, threshold and radiation hormesis. [69]
The first preliminary results on the effects of almost no radiation on cell cultures were reported by two research groups in 2011 and 2012: researchers in the US studied cell cultures shielded from radiation in a steel chamber 650 meters underground at the Waste Isolation Pilot Plant in Carlsbad, New Mexico, [70] and researchers in Europe proposed an experimental design to study the effects of almost no radiation on mouse cells (pKZ1 transgenic chromosomal inversion assay), but did not carry out the experiment. [71]
Background radiation is a measure of the level of ionizing radiation present in the environment at a particular location which is not due to deliberate introduction of radiation sources.
Acute radiation syndrome (ARS), also known as radiation sickness or radiation poisoning, is a collection of health effects that are caused by being exposed to high amounts of ionizing radiation in a short period of time. Symptoms can start within an hour of exposure, and can last for several months. Early symptoms are usually nausea, vomiting and loss of appetite. In the following hours or weeks, initial symptoms may appear to improve, before the development of additional symptoms, after which either recovery or death follow.
The sievert is a unit in the International System of Units (SI) intended to represent the stochastic health risk of ionizing radiation, which is defined as the probability of causing radiation-induced cancer and genetic damage. The sievert is important in dosimetry and radiation protection. It is named after Rolf Maximilian Sievert, a Swedish medical physicist renowned for work on radiation dose measurement and research into the biological effects of radiation.
Ionizing radiation, including nuclear radiation, consists of subatomic particles or electromagnetic waves that have sufficient energy to ionize atoms or molecules by detaching electrons from them. Some particles can travel up to 99% of the speed of light, and the electromagnetic waves are on the high-energy portion of the electromagnetic spectrum.
Radiation dosimetry in the fields of health physics and radiation protection is the measurement, calculation and assessment of the ionizing radiation dose absorbed by an object, usually the human body. This applies both internally, due to ingested or inhaled radioactive substances, or externally due to irradiation by sources of radiation.
Hormesis is a two-phased dose-response relationship to an environmental agent whereby low-dose amounts have a beneficial effect and high-dose amounts are either inhibitory to function or toxic. Within the hormetic zone, the biological response to low-dose amounts of some stressors is generally favorable. An example is the breathing of oxygen, which is required in low amounts via respiration in living animals, but can be toxic in high amounts, even in a managed clinical setting.
The linear no-threshold model (LNT) is a dose-response model used in radiation protection to estimate stochastic health effects such as radiation-induced cancer, genetic mutations and teratogenic effects on the human body due to exposure to ionizing radiation. The model assumes a linear relationship between dose and health effects, even for very low doses where biological effects are more difficult to observe. The LNT model implies that all exposure to ionizing radiation is harmful, regardless of how low the dose is, and that the effect is cumulative over lifetime.
Downwinders were individuals and communities in the intermountain West between the Cascade and Rocky Mountain ranges primarily in Arizona, Nevada, New Mexico, and Utah but also in Oregon, Washington, and Idaho who were exposed to radioactive contamination or nuclear fallout from atmospheric or underground nuclear weapons testing, and nuclear accidents.
Bernard Leonard Cohen was born in Pittsburgh and was Professor Emeritus of Physics at the University of Pittsburgh. Cohen was a staunch opponent of the linear no-threshold model (LNT), which postulates that there exists no safe threshold for radiation exposure; his view has support from only a minority of the field. He died in March 2012.
Radium and radon are important contributors to environmental radioactivity. Radon occurs naturally as a result of the decay of radioactive elements in soil, and it can accumulate in houses built on areas where such decay occurs. Radon is a major cause of cancer; it is estimated to contribute to ~2% of all cancer-related deaths in Europe.
Radiobiology is a field of clinical and basic medical sciences that involves the study of the effects of ionizing radiation on living things, in particular health effects of radiation. Ionizing radiation is generally harmful and potentially lethal to living things but can have health benefits in radiation therapy for the treatment of cancer and thyrotoxicosis. Its most common impact is the induction of cancer with a latent period of years or decades after exposure. High doses can cause visually dramatic radiation burns, and/or rapid fatality through acute radiation syndrome. Controlled doses are used for medical imaging and radiotherapy.
Christopher Busby is a British scientist primarily studying the health effects of internal ionising radiation. Busby is a director of Green Audit Limited, a private company, and scientific advisor to the Low Level Radiation Campaign (LLRC).
The effects of the 1979 Three Mile Island nuclear accident are widely agreed to be very low by scientists in the relevant fields. The American Nuclear Society concluded that average local radiation exposure was equivalent to a chest X-ray, and maximum local exposure equivalent to less than a year's background radiation. The U.S. BEIR report on the Biological Effects of Ionizing Radiation states that "the collective dose equivalent resulting from the radioactivity released in the Three Mile Island accident was so low that the estimated number of excess cancer cases to be expected, if any were to occur, would be negligible and undetectable." A variety of epidemiology studies have concluded that the accident has had no observable long-term health effects. One dissenting study is "a re-evaluation of cancer incidence near the Three Mile Island nuclear plant" by Dr Steven Wing of the University of North Carolina. In this study, Dr Wing and his colleagues argue that earlier findings had "logical and methodological problems" and conclude that "cancer incidence, specifically lung cancer and leukemia, increased [more] following the TMI accident in areas estimated to have been in the pathway of radioactive plumes than in other areas." Other dissenting opinions can be found in the Radiation and Public Health Project, whose leader, Joseph Mangano, has questioned the safety of nuclear power since 1985.
The health effects of radon are harmful, and include an increased chance of lung cancer. Radon is a radioactive, colorless, odorless, tasteless noble gas, which has been studied by a number of scientific and medical bodies for its effects on health. A naturally-occurring gas formed as a decay product of radium, radon is one of the densest substances that remains a gas under normal conditions, and is considered to be a health hazard due to its radioactivity. Its most stable isotope, radon-222, has a half-life of 3.8 days. Due to its high radioactivity, it has been less well studied by chemists, but a few compounds are known.
Exposure to ionizing radiation is known to increase the future incidence of cancer, particularly leukemia. The mechanism by which this occurs is well understood, but quantitative models predicting the level of risk remain controversial. The most widely accepted model posits that the incidence of cancers due to ionizing radiation increases linearly with effective radiation dose at a rate of 5.5% per sievert; if correct, natural background radiation is the most hazardous source of radiation to general public health, followed closely by medical imaging. Additionally, the vast majority of non-invasive cancers are non-melanoma skin cancers caused by ultraviolet radiation. Non-ionizing radio-frequency radiation from mobile phones, electric power transmission, and other similar sources has been investigated as a possible carcinogen by the WHO's International Agency for Research on Cancer, but to date no evidence of this has been observed.
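The 5.5%-per-sievert coefficient makes the LNT model trivial to state in code. This sketch is illustrative only; the example doses (a nominal 10 mSv scan, a nominal 2.4 mSv/year background figure) are our assumptions, not values from this article:

```python
# Linear no-threshold model: excess lifetime cancer risk scales linearly
# with effective dose, with no safe threshold below which risk is zero.
RISK_PER_SIEVERT = 0.055   # ~5.5% per Sv, the coefficient quoted above

def lnt_excess_risk(dose_sv):
    """Excess lifetime cancer risk for a given effective dose in sieverts."""
    return RISK_PER_SIEVERT * dose_sv

# Nominal example doses (assumptions for illustration):
print(f"{lnt_excess_risk(0.010):.5f}")    # 10 mSv scan -> ~0.055% excess risk
print(f"{lnt_excess_risk(0.0024):.6f}")   # 2.4 mSv annual background dose
```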
Studies with protons and HZE nuclei of relative biological effectiveness for molecular, cellular, and tissue endpoints, including tumor induction, demonstrate risk from space radiation exposure. This evidence may be extrapolated to applicable chronic conditions that are found in space and from the heavy ion beams that are used at accelerators.
Travel outside the Earth's protective atmosphere, magnetosphere, and in free fall can harm human health, and understanding such harm is essential for successful crewed spaceflight. Potential effects on the central nervous system (CNS) are particularly important. A vigorous ground-based cellular and animal model research program will help quantify the risk to the CNS from space radiation exposure on future long distance space missions and promote the development of optimized countermeasures.
Radiation exposure is a measure of the ionization of air due to ionizing radiation from photons. It is defined as the electric charge freed by such radiation in a specified volume of air divided by the mass of that air. As of 2007, "medical radiation exposure" was defined by the International Commission on Radiological Protection as exposure incurred by people as part of their own medical or dental diagnosis or treatment; by persons, other than those occupationally exposed, knowingly, while voluntarily helping in the support and comfort of patients; and by volunteers in a programme of biomedical research involving their exposure. Common medical tests and treatments involving radiation include X-rays, CT scans, mammography, lung ventilation and perfusion scans, bone scans, cardiac perfusion scan, angiography, radiation therapy, and more. Each type of test carries its own amount of radiation exposure. There are two general categories of adverse health effects caused by radiation exposure: deterministic effects and stochastic effects. Deterministic effects are due to the killing/malfunction of cells following high doses; and stochastic effects involve either cancer development in exposed individuals caused by mutation of somatic cells, or heritable disease in their offspring from mutation of reproductive (germ) cells.
Ionizing radiation can cause biological effects which are passed on to offspring through the epigenome. The effects of radiation on cells have been found to depend on the dosage of the radiation, the location of the cell within the tissue, and whether the cell is a somatic or germ-line cell. Generally, ionizing radiation appears to reduce methylation of DNA in cells.
Sheldon Wolff was an American radiobiologist, cytogeneticist, and environmental health expert on mutagenic chemicals.
Cancer incidence in a cohort of ~6,250 people has been studied, and marginally raised levels of cancer in relation to assessed doses have been reported, although there are a number of uncertainties in the study, including a lack of control for confounding factors such as smoking.