
The committee's statement of task directed it to "identify current research directions in radiobiological science related to human health risks from exposures to low-dose ionizing radiation." This chapter fulfills that directive by presenting an overview of the state of the science, major challenges and uncertainties, and research being pursued when the committee's work concluded in late 2013. It is not a systematic literature review—a task beyond the scope of this study—but is instead a discussion of some recent developments in the field that highlight research directions. The focus is on research findings released after publication of two comprehensive reviews on the effects of low-level ionizing radiation: the National Research Council's Health Risks from Exposure to Low Levels of Ionizing Radiation (known as the Biological Effects of Ionizing Radiation VII or the "BEIR VII" report; NRC, 2006) and the United Nations Scientific Committee on the Effects of Atomic Radiation's (UNSCEAR's) 2008 report to the General Assembly (UNSCEAR, 2010). As noted earlier, this report defines low doses to be those in the range of 100 milligray (mGy) and below, an approach consistent with that used by the International Commission on Radiological Protection (ICRP, 2007), the BEIR VII report (NRC, 2006), and UNSCEAR (2010). Doses on the order of 1 gray (Gy) are referred to as moderately high; those on the order of 10 mGy, as very low. Discussions in the chapter focus primarily on low linear energy transfer (LET) ionizing radiation unless otherwise stated.

_________________ Gray (Gy) is the International System of Units name for the unit of absorbed dose; 1 Gy is equal to 1 joule per kilogram or 100 rad.

INTRODUCTION Low levels of radiation can induce DNA damage. Cells respond to the direct radiogenic damage to DNA by attempting to repair it. If that repair is not perfect, the modifications induced may be transmitted to daughter cells and may lead to uncontrolled cell growth and consequently cancer.

More recent experiments have challenged the view that genetic alterations are restricted to directly irradiated cells by demonstrating that radiogenic damage can also be observed in cells that are not themselves irradiated (nontargeted effects) but are the progeny of an irradiated cell (radiation-induced genomic instability) or have received the damaging signal from the irradiated cell within the tissue microenvironment (radiation-induced bystander effects). The exact mechanisms by which these nontargeted responses result in adverse health effects are not yet understood. However, the best-understood bystander effect arises in a setting of chronic inflammation, where the production of reactive oxygen species results in damage in adjacent cells (Kadhim et al., 2013; Morgan, 2003). Radiation effects at low doses are often described as stochastic: there appears to be no threshold below which an effect does not occur, and the greater the exposure, the higher the probability that an effect will occur, but the severity of the effect is independent of the dose received.

This is in contrast to tissue reactions (formerly called deterministic effects), which are based on tissue damage and whose severity increases as the dose increases. A tissue reaction typically has a threshold (on the order of 100 mGy or higher) below which the effect does not occur. Although much is known about the risk of health effects after exposure to radiation at the 100 mGy–1 Gy dose range and high dose rates, there is greater uncertainty about the risks associated with exposure to low doses of radiation. Nevertheless, it is low doses that are of primary interest for radiation-protection purposes, such as making decisions on the dose limit for occupational exposure or the resettlement of populations after a radiological release event. Brenner and colleagues (2013) assert that setting radiation-protection standards too stringently in this dose range could have severe economic consequences, whereas setting them too leniently would present an unacceptable health burden. Low-dose radiation research involves observational studies of populations and experimental studies of radiation effects on molecules, cells, tissues, and animal models.

Epidemiological studies provide information about possible associations between exposure to low-dose radiation and disease in humans. Experimental studies improve our understanding of the mechanisms by which low-dose ionizing radiation causes damage and how cells and tissues respond to that damage. As noted earlier, the U.S. Department of Energy (DOE) Low Dose Radiation Research Program began work in 1999 to investigate radiation effects on genomes and cells to better understand outcomes in living organisms and develop radiation protection standards based on risk. The original plan called for a 10-year effort funded at $22.4M/year in years 1–3, $25.6M/year in years 4–6, and $18.6M/year in years 7–10 (PNNL, 2012). More recently, however, the program's funding has been reduced.

_________________ The term "tissue reaction" has been adopted because some of these effects can be modified post irradiation rather than being solely determined at the time of irradiation (ICRP, 2012).

It no longer has its own line in DOE's proposed budget; instead, low-dose research activities are now included in the Radiological Sciences/Radiobiology line of the Office of Biological and Environmental Research's budget. The DOE FY 2015 Congressional Budget Request notes that overall funding for radiobiology was ~$6.2M for FY 2013, that the enacted total for FY 2014 was $3.2M, and that the FY 2015 request was for $2.4M (DOE, 2014). The accompanying text indicates that low-dose radiobiology research will "emphasize a systems biology approach to understanding the effects of low dose radiation on cellular processes and epidemiological studies to uncover statistically significant effects of low dose radiation in large populations" (DOE, 2014). It goes on to state that "[d]ecreases in radiobiology reflect a shift towards bioenergy and environmental research within the Biological Systems Science portfolio" (DOE, 2014). Research on low-dose radiation in the United States lacks a strategic agenda aimed at addressing a common set of scientific questions (Brenner et al., 2013).

An assessment of important scientific questions and research priorities, with emphasis on questions of greatest policy relevance for regulators involved in radiation protection, was developed over several years by the European Commission's High Level and Expert Group (EC, 2009). This effort led to programs such as the Multidisciplinary European Low Dose Initiative (MELODI, 2014) and its Low Dose Research Towards Multidisciplinary Integration (DoReMi) research-integration component (DoReMi, 2014). A 2012 workshop conducted as part of this effort assessed the state of the art in research into the risk of low-dose radiation exposure and offered recommendations for updating its research agenda and a work plan for future research activities (Salomaa et al., 2013). When this report was completed, the most current update to the MELODI strategic research agenda was its fourth draft, published in March 2013 (Averbeck, 2013). In this chapter, discussions of scientific gaps and recent developments in low-dose radiation research parallel the lines of research that the MELODI consortium believes to be the most promising for better understanding low-dose radiation health effects (Averbeck, 2013). First, the committee addresses some of the key factors that may modify low-dose radiation health effects.

It then discusses the primary approaches used to answer questions about such health effects—experimental studies and epidemiological studies—and provides insights on current research directions.

LOW-DOSE RADIATION–RISK MODIFIERS

The Shape of the Dose-Response Curve

In the absence of a detailed understanding of human biological effects at low doses (less than 100 mGy) and low dose rates, radiation-protection standards and exposure limits that require an assessment of risks are set by assuming that there is a linear relationship between exposure and effect and by extrapolating from the outcomes observed at higher radiation levels.

This is called the linear no-threshold (LNT) model (ICRP, 1977; NCRP, 2001). Whether the risk of cancer actually diverges from the predictions made by using this model at low doses is disputed. Although U.S. and international bodies consider using the LNT approach to be prudent in the absence of direct evidence, the possibility of other shapes for dose-response curves at doses less than 100 mGy cannot be ruled out. As simplistically illustrated in Figure 2-1, various potential extrapolations may be consistent with the data but would result in different risk estimates at low doses. For example, compared with an LNT extrapolation (curve a), a threshold (curve d) or beneficial-effect (termed "hormetic"; curve e) extrapolation would result in decreased risk at doses less than 100 mGy, whereas a downward-curving extrapolation (curve b) would result in increased risk.

With the exception of scenarios d and e, risk estimates assume that for any dose, no matter how low, there is a corresponding risk. In scenario b, the assumption of linearity would underestimate the risk, whereas in scenarios c and d, the assumption of linearity would overestimate the risk. National and international scientific groups such as the International Commission on Radiological Protection (ICRP), the National Council on Radiation Protection and Measurements (NCRP), BEIR, and UNSCEAR regularly review and currently endorse the use of the LNT model for assessing risk in a radiation-protection context.

However, other institutions, such as the French Academy of Sciences, have rejected the LNT model at low doses (Aurengo et al., 2005) on the basis of evidence for protective postirradiation cellular responses such as activation of DNA-repair systems and programmed cell death of damaged cells. This disagreement highlights the need for more research to characterize the health risks at low radiation doses with more certainty.

_________________ The biophysical argument for the model is described later in this chapter in the "Mechanistic Models" section. The logic and evidence underlying these extrapolations are described by Brenner and colleagues (2003).

FIGURE 2-1 Potential extrapolated radiation risks at low doses in five different scenarios: curve a, linear extrapolation; b, downward curving (decreasing slope); c, upward curving (increasing slope); d, threshold; e, hormetic. SOURCE: Reprinted from Brenner et al., 2003. Copyright 2003, with permission from the National Academy of Sciences, U.S.A.
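The practical stakes of choosing among these curves can be made concrete with a small numerical sketch. The Python fragment below (illustrative only; the excess-relative-risk slope, threshold, and hormetic term are assumed values, not estimates from this report or from BEIR VII) pins all five shapes in Figure 2-1 to the same risk at a reference dose of 1 Gy and compares what each implies at 50 mGy.

    import math

    # Illustrative sketch only: all coefficients below are assumed, not report values.
    # Each curve is pinned to the same excess relative risk (ERR) at a reference
    # dose of 1 Gy, where epidemiological data are comparatively strong, and then
    # extrapolated downward in a different way (Figure 2-1, curves a-e).
    ERR_AT_1GY = 0.5      # hypothetical ERR observed at the reference dose
    THRESHOLD_GY = 0.1    # hypothetical threshold for curve (d)

    def err(dose_gy, curve):
        if curve == "a (linear, LNT)":
            return ERR_AT_1GY * dose_gy
        if curve == "b (downward curving)":   # steeper than linear at low doses
            return ERR_AT_1GY * math.sqrt(dose_gy)
        if curve == "c (upward curving)":     # shallower than linear at low doses
            return ERR_AT_1GY * dose_gy ** 2
        if curve == "d (threshold)":          # no excess risk below the threshold
            return ERR_AT_1GY * max(0.0, dose_gy - THRESHOLD_GY) / (1 - THRESHOLD_GY)
        if curve == "e (hormetic)":           # assumed net benefit at low doses
            return ERR_AT_1GY * dose_gy - 1.0 * dose_gy * math.exp(-dose_gy / 0.1)
        raise ValueError(curve)

    for curve in ["a (linear, LNT)", "b (downward curving)", "c (upward curving)",
                  "d (threshold)", "e (hormetic)"]:
        print(f"{curve:22s} ERR at 50 mGy = {err(0.05, curve):+.4f}")

Under these assumed numbers, curve b implies several times the LNT risk at 50 mGy, curve c implies a small fraction of it, and curves d and e imply no excess risk or a net benefit; the data available at low doses cannot yet distinguish among them.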

Biological Effectiveness

The biological effect of radiation differs by type, even when the absorbed dose is the same. High (i.e., densely ionizing)-LET radiation has been shown to have a different relative biological effectiveness (RBE) than low (i.e., sparsely ionizing)-LET radiation. RBE, which is defined as the ratio of the dose of a reference radiation to the dose of the test radiation required to produce the same effect, depends on the dose level and the endpoint measured. For radiation-protection purposes, the relative effectiveness of each type of radiation is represented by a weighting factor that is used to convert the tissue-absorbed dose into the equivalent dose. These values, published by the ICRP, were last revised in 2007 (ICRP, 2007). One endpoint that may be considered is cell killing. Figure 2-2 illustrates RBE as a function of surviving fraction.

FIGURE 2-2 Relative biological effectiveness as a function of surviving fraction. SOURCE: Hall and Giaccia, 2005.

The primary difference between the high- and low-LET radiation dose-response curves is the reduction of the curve's so-called shoulder region with increasing LET. In the low-dose region of low-LET radiation, damage is due primarily to single tracks acting independently, and it is relatively easier for cells to repair this form of damage. In the high-dose region of the low-LET radiation curve, damage from multiple tracks more likely contributes to the nonlinear shape of the curve.

For high-LET radiation, the dose-response curve is more linear because the damage is complex and less reparable than that from low-LET radiation.

_________________ Surviving fraction is defined as the ratio of the number of viable cells (or cell colonies) after a given exposure to the number of cells or colonies irradiated.
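A minimal sketch of the ideas in the last two paragraphs, assuming a linear-quadratic (LQ) survival curve for low-LET radiation and a purely exponential curve for high-LET radiation, shows how an RBE value follows from a chosen endpoint; the alpha and beta coefficients are illustrative assumptions, not measured values cited in this chapter.

    import math

    # Assumed, illustrative LQ coefficients (not measurements from this report).
    ALPHA_LOW, BETA_LOW = 0.2, 0.05   # low-LET (e.g., gamma rays): shouldered curve
    ALPHA_HIGH = 1.0                  # high-LET: essentially exponential (no shoulder)

    def sf_low_let(dose_gy):
        """Surviving fraction for low-LET radiation, LQ model."""
        return math.exp(-(ALPHA_LOW * dose_gy + BETA_LOW * dose_gy ** 2))

    def sf_high_let(dose_gy):
        """Surviving fraction for high-LET radiation, assumed purely exponential."""
        return math.exp(-ALPHA_HIGH * dose_gy)

    def rbe_at_survival(target_sf):
        """RBE = dose of reference (low-LET) radiation / dose of test (high-LET)
        radiation giving the same surviving fraction."""
        effect = -math.log(target_sf)
        # invert the LQ expression: beta*D^2 + alpha*D - effect = 0
        d_ref = (-ALPHA_LOW + math.sqrt(ALPHA_LOW ** 2 + 4 * BETA_LOW * effect)) / (2 * BETA_LOW)
        d_test = effect / ALPHA_HIGH
        return d_ref / d_test

    for sf in (0.5, 0.1, 0.01):
        print(f"surviving fraction {sf:>4}: RBE ~ {rbe_at_survival(sf):.2f}")

With these assumed coefficients the computed RBE grows as the surviving fraction approaches 1, that is, toward lower doses, which is one reason RBE is harder to pin down at low doses.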

Figure 2-3 illustrates RBE as a function of LET. For cell killing, the RBE is easily quantified at high doses; differences in effects at low doses are more difficult to determine with confidence. In addition, at low mean doses of high-LET radiation the frequency of events in most cells is zero, with a few cells having only one event and the rare cell having more than one event.

This leads to large differences in energy-deposition distribution, which have been correlated to differences in RBE (UNSCEAR, 2000). In this case, non-targeted effects, particularly genomic instability, are of greater importance. High-LET radiation has been shown to be more effective at inducing transmissible instability than low-LET radiation is (UNSCEAR, 2012).

For low-LET radiation, the energy deposition remains fairly homogeneous at doses in the range of about 10–100 mGy. However, it is still difficult to measure biological effects at doses lower than that. With respect to the induction of genomic instability, however, there appears to be a threshold of about 0.5 Gy for low-LET radiation (Koterov et al., 2005; Zyuzikov et al., 2011), indicating that this process would not contribute to the development of health effects at low doses.

FIGURE 2-3 Relative biological effectiveness as a function of linear energy transfer. SOURCE: Bushberg et al., 2002.

It is clear that high-LET radiation is more effective at causing cellular effects such as chromosome aberrations and mutations per unit dose. However, the number of DNA breaks produced per Gy is not very different for high- and low-LET radiation (UNSCEAR, 2000).

The differences are in the efficiency and fidelity of the repair of DNA double-strand breaks and in the spatial distribution of the breaks. Thus, there is still much to be understood about the RBE at low doses of radiation and about how the RBE changes with decreasing dose.

Dose and Dose-Rate Effects

The role of dose rate in the biological effects of radiation is also a subject of much debate. Considerable evidence exists from molecular, cellular, tissue, and animal studies that induced damage decreases as the dose rate decreases, at least for X-rays and gamma rays (NCRP, 1980); however, epidemiological data do not consistently support this conclusion. Current radiation-protection practices often assume that cancer risks are lower at low doses and low dose rates than they are at high doses and high dose rates by a factor known as the DDREF (dose and dose-rate effectiveness factor). For many years, the recommended DDREF ranged between 2 and 10 (NRC, 1990). More recently, the BEIR VII report (NRC, 2006) recommended a distribution of DDREF values with a modal value of 1.5, whereas ICRP's 2007 guidance on the control of exposure from radiation sources recommended a modal value of 2.0 (ICRP, 2007).
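How a DDREF is applied in practice can be shown in a few lines: a risk coefficient derived from acute, moderately high-dose data is divided by the DDREF before being used for protracted low-dose exposure. The risk coefficient below is an assumed, illustrative number rather than a value endorsed by BEIR VII or ICRP.

    # Sketch of DDREF use: the acute-exposure risk coefficient is an assumed
    # illustrative value, not a BEIR VII or ICRP estimate.
    ERR_PER_GY_ACUTE = 0.5           # hypothetical excess relative risk per Gy, acute exposure

    def err_low_dose_rate(dose_gy, ddref):
        """Excess relative risk for a protracted low dose, after DDREF adjustment."""
        return (ERR_PER_GY_ACUTE / ddref) * dose_gy

    dose = 0.05                      # 50 mGy, protracted
    for ddref in (1.0, 1.5, 2.0):    # values discussed in the text (Jacob, BEIR VII, ICRP)
        print(f"DDREF {ddref}: ERR at {dose*1000:.0f} mGy = {err_low_dose_rate(dose, ddref):.4f}")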

Animal studies at low dose rates or with internally deposited radionuclides have clearly demonstrated that protracted low-dose radiation exposure of tissues has fewer detrimental effects than the same dose delivered as an acute exposure (Brooks, 2011; NCRP, 1980; Vares et al., 2011). Studies of dogs and mice exposed either daily to low doses or to equal total doses given in a single exposure revealed that the animals given the high-dose-rate exposure were more likely to develop cancer than those that received the low-dose-rate exposures (Carnes and Fritz, 1991; Carnes et al., 1998; Haley et al., 2011; Shin et al., 2011). In addition, gene-expression studies in mice showed large variation in expression levels depending on exposure dose rates, suggesting that there are significant differences in the biological consequences of dose-rate factors (Uehara et al., 2010). For radiation-induced carcinogenesis in humans, however, the data are limited (Brenner, 1999), although recent studies of chronically exposed workers suggest a DDREF of about 1.0 (Jacob et al., 2009). At high LET, there is insufficient evidence of a diminution in cancer risk with decreasing dose rate. Indeed, for exposure to radon and its progeny (through alpha particles), evidence exists that radiation-induced cancer risks actually increase at low dose rates (Lubin et al., 1995; NRC, 1999).

The dose-rate effect is an important variable for understanding radiation risk. Reconciling the more than 60 years of radiation biology research with the epidemiological data remains a significant challenge for the future.

External Versus Internal Sources of Radiation

Scientific uncertainty also exists about the differences in tissue effects, and therefore the risks, from external versus internal radiation sources. Although for radiation-protection purposes the effect is currently assumed to be the same regardless of source location, it is understood that internal deposition of radionuclides does not produce irradiation as uniform as external exposure does. Improved biokinetic and dosimetric models are crucial for making progress on this scientific question (EC, 2009). This issue is particularly salient for the military because of the potential for personnel to be wounded by depleted uranium (DU), a component of certain munitions and armor.

EXPERIMENTAL STUDIES—CURRENT WORK AND RESEARCH DIRECTIONS

Laboratory Studies

Cancer development is a complex process.

In vitro studies have been valuable in understanding how radiation causes DNA damage, including changes in gene expression, DNA strand breaks, mutations, and chromosomal aberrations. However, no credible in vitro model yet exists for radiation-induced cancer. In addition, low-dose radiation studies that use DNA-damage endpoints have not established a quantitative connection between any of these endpoints and radiation-induced cancer, which occurs long after exposure (Goodhead, 2009).

Using Radiation Biology to Augment Radiation Epidemiological Studies

Although experimental radiation biology cannot currently provide direct models for estimating low-dose radiation health risks in humans, there is much interest in using radiobiology to augment epidemiological studies (Preston et al., 2013). One example is the search for a "molecular fingerprint" of radiation-induced cancer within tumors: If radiation-induced cancers could be definitively identified, then stochastic models would be unnecessary, and the problems of statistical estimation of radiation risks might be largely eliminated. Suggestions have been made in this regard (Boltze et al., 2009; Dom et al., 2012; Hess et al., 2011; Taylor et al., 1994; Vahakangas et al., 1992), but none has been validated convincingly, although new high-throughput evaluation techniques show clear promise (Besaratinia et al., 2012). A second example focuses on sensitive subpopulations. Plausible evidence suggests that in an exposed population, a significant proportion of radiation-induced cancers may occur in genetically sensitive subpopulations (Flint-Richter and Sadetzki, 2007; Land et al., 1993). If this is the case and if rapid screening techniques could be developed to identify these radiation-sensitive individuals, epidemiological studies of low doses would be greatly facilitated. Efforts to identify individuals who are genetically susceptible to the effects of ionizing radiation are ongoing.

_________________ A credible model would provide convincing quantitative and mechanistic evidence of an association between radiation-induced cancer and an endpoint such as gene expression changes, DNA strand breaks, or mutations. Brenner (2014) asserts that the most plausible in vitro model for radiation-induced cancer is oncogenic transformation but notes that it is not feasible to conduct studies of this effect at levels less than 1 Gy. Further, evidence in this exposure range comes primarily from animal models (Mancuso et al., 2008).

Not surprisingly, research to date indicates that adverse mutations in genes involved in the DNA-repair and cell-cycle-control pathways may play an important role. However, thus far only rather rare mutations have been shown to confer an appreciably heightened radiation effect. Given the role of the BRCA1 and BRCA2 genes in the repair of DNA double-strand breaks and the high prevalence of breast cancer among women who carry mutations in them, for example, investigators have hypothesized and observed that mutations in these genes enhance the radiation-associated increase in breast cancer after exposure to ionizing radiation from diagnostic X-rays (Andrieu et al., 2006). The Women's Environment, Cancer, and Radiation Epidemiology (WECARE) study examined the effect of radiation exposure on the etiology of second breast cancers in women treated with radiation for an initial breast cancer (Malone et al., 2010). The investigators found that women carriers of BRCA1 and BRCA2 mutations had 4.5- and 3.4-fold increased risks of contralateral breast cancer, respectively, compared with noncarrier women.

Biomarkers of Dose or Effect

A large range of biological marker assays can be used to determine the dose received by an individual.

These assays have the advantage of measuring the biological effect of the dose, which can take individual susceptibility into account. The most widely accepted techniques are cytogenetic, with the dicentric chromosome assay (DCA) currently the technique of choice in the case of recent acute exposures (IAEA, 2011). In general, cytogenetic assays tend to be the most widely used because of their sensitivity and specificity to ionizing radiation. Other techniques, each having advantages in different scenarios (IAEA, 2011), include the cytokinesis-blocked micronucleus assay (CBMN), fluorescence in situ hybridization (FISH), and premature chromosome condensation (PCC). One limitation of cytogenetic assays (except the PCC) is the requirement to culture cells for at least 48 hours before preparing the samples for an analysis that is itself laborious and requires expertise. When analyzed by microscopy, these assays give accurate dose estimates, but they cannot be performed quickly enough to be useful after a mass-casualty radiation event in which the dose may need to be determined for hundreds to thousands of exposed individuals.
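As a sketch of how the DCA yields a dose estimate, the observed dicentric yield per cell is compared with a calibration curve of the form Y = C + alpha*D + beta*D^2 and the curve is inverted for dose; this is the general approach described in the IAEA (2011) manual, but the calibration coefficients used here are assumed, illustrative values, not coefficients from that manual.

    import math

    # Assumed, illustrative calibration coefficients for a dicentric yield curve
    # Y = C + alpha*D + beta*D^2 (background, linear, and quadratic terms).
    C, ALPHA, BETA = 0.001, 0.02, 0.06

    def estimate_dose_gy(dicentrics, cells_scored):
        """Invert the calibration curve for the observed yield (dicentrics per cell)."""
        y = dicentrics / cells_scored
        # solve beta*D^2 + alpha*D + (C - y) = 0 for the positive root
        disc = ALPHA ** 2 - 4 * BETA * (C - y)
        return (-ALPHA + math.sqrt(disc)) / (2 * BETA)

    # Example: 25 dicentrics found in 500 scored metaphases
    print(f"estimated whole-body dose ~ {estimate_dose_gy(25, 500):.2f} Gy")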

Over the years, many efforts have been made to automate the existing cytogenetic assays using metaphase-finding platforms, flow cytometry, imaging cytometry, and other high-throughput systems, for example, the Rapid Automated Biodosimetry Tool (RABiT) (Vaurijoux et al., 2009). Many other biomarkers of radiation exposure have also been examined with the hope of developing a faster biodosimeter that could be used in the field. These can be grouped as genetic markers, hematological markers, protein markers, and physical markers. Many of the methods in use today, along with their respective lowest measurable doses, strengths, and limitations, are presented in Table 2-1.

Systems Biology

Systems biology and systems radiation biology are young but rapidly evolving fields that investigate the complex interactions of cells within tissues and within organisms in the context of the whole irradiated system rather than the response of the individual cell. The ultimate goal is to provide a mathematical description of radiation-induced carcinogenesis across all dose levels. Biological systems are by nature complex, and current radiation modeling is limited to simplistic models that do not fully reflect the inherent biological diversity of humans or the mechanisms that underlie risks to the population (Barcellos-Hoff, 2008). Indeed, the fate of irradiated cells within a tissue is strongly modulated by their interactions with one another and with the tissue microenvironment (Barcellos-Hoff, 2005). Linking mechanisms of cellular and molecular responses to radiation exposure with macroscopic processes at the tissue, organ, and whole-organism levels would improve our ability to predict cancer risk in response to irradiation (Dauer et al., 2010).

Although much is known about the DNA damage induced by the deposition of energy from exposure to ionizing radiation and the subsequent cellular responses, a significant gap exists in the understanding of how these might lead to detrimental health effects years after exposure. Advances in high-throughput "-omics" technologies, however, enable interrogation of radiation-induced effects at the genomic, transcriptomic, proteomic, and metabolomic levels. The systems-biology challenge is to integrate these -omics platforms to understand radiation dose and dose-rate effects in a complex tissue as a function of time and then to relate these effects to observable functional physiological or physical changes in that tissue after irradiation. Addressing this problem requires knowledge of the biological processes involved, such as the molecular and biochemical signaling pathways, and computational-modeling capabilities to organize this information in a way that it can be integrated with radiation-risk models.

TABLE 2-1 List of Existing Biodosimetry Methods (method; sensitivity; strengths; limitations)
Cytogenetic:
• DCA; 0.1 Gy; partial-body assessment, sensitivity and specificity to radiation; 48-h culture time, 3-month half-life of dicentrics
• CBMN; 0.2 Gy; easy to score and automate; 72-h culture time, cannot identify partial-body exposures
• FISH; 0.25 Gy; retrospective; high background, expensive
• PCC; 1 Gy; high dose range; poor sensitivity
Genetic:
• Glycophorin A; 1 Gy; fast (3 h for dose estimate); present in 50% of population
• Gene expression; 0.1 Gy; fast; transient response
Hematological:
• Cell counts; 1 Gy; fast

Mechanistic Models

At present, the only approach to risk estimation at low doses involves models for interpolation across a range of observed doses to derive estimates of risk at lower doses. Risk estimates at low doses may be highly sensitive to model choice. At doses greater than 1 Gy, cells are typically hit by many tracks of radiation, whereas for low doses of low-LET radiation (i.e., at doses less than 100 mGy), most cells are hit by a single track; the lower the dose within the low-dose region, the smaller the number of cells hit. That is, in the low-dose region the damage to each cell that is hit remains essentially the same (i.e., a single radiation track passing through a cell). What changes is the number of cells that are subjected to this damage, which decreases proportionally with decreasing dose.

This is the biophysical argument for the LNT model. Many caveats are associated with this biophysical argument. Examples include the potential effects of phenomena such as intercellular communication and immunosurveillance and the possibility of different radiobiological processes at very low doses (less than 10 mGy), compared with doses at the 10–100 mGy range (Brenner, 2009; Tubiana et al., 2006).

However, currently there is no scientific knowledge as to whether or how these phenomena and processes would deviate cancer risks at very low doses from those predicted by linear extrapolation from low doses (Brenner et al., 2003). _________________ The suffix “-omics” refers to the study of the genome, proteome, or metabolome; hence genomics, proteomics, and metabolomics. Typically, -omics involves analysis of large databases of information and requires that investigators with different areas of expertise (e.g., computer scientists and biologists) work together to best interpret the data.
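Before turning to epidemiological approaches, the single-track argument sketched in the "Mechanistic Models" discussion above can be made numerical with Poisson statistics. The dose at which a nucleus is traversed by one track on average is set here to an assumed reference value purely for illustration.

    import math

    # Assumed reference: mean of one track per nucleus at this dose (illustrative only).
    DOSE_PER_MEAN_TRACK_MGY = 100.0

    def hit_statistics(dose_mgy):
        lam = dose_mgy / DOSE_PER_MEAN_TRACK_MGY          # mean tracks per nucleus
        p_hit = 1 - math.exp(-lam)                        # traversed at least once
        p_single = lam * math.exp(-lam)                   # traversed exactly once
        return lam, p_hit, p_single

    for dose in (1, 10, 100, 1000):                       # mGy
        lam, p_hit, p_single = hit_statistics(dose)
        share_single = p_single / p_hit                   # of the hit nuclei, fraction hit once
        print(f"{dose:>5} mGy: fraction of nuclei hit {p_hit:.4f}, "
              f"of which hit exactly once {share_single:.2%}")

In this toy calculation the fraction of nuclei hit scales almost linearly with dose at low doses and nearly every hit nucleus sees a single track, whereas at 1 Gy multiple traversals dominate; that contrast is the intuition behind the linear extrapolation and also behind the caveats about very low doses.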

EPIDEMIOLOGICAL STUDIES—CURRENT WORK AND RESEARCH DIRECTIONS

Epidemiological radiation studies become increasingly impractical at low doses because of the large sample sizes required: the background cancer rate is high enough that the study's signal-to-noise ratio becomes prohibitively low, as the rough calculation after this paragraph illustrates. To estimate cancer risks at low doses (and potentially more chronic exposures), one then needs to extrapolate radiation-related risks from epidemiological studies at higher doses (Puskin, 2008). In any epidemiological study, investigators need to examine carefully whether an observed association is real or results from bias. Well-designed epidemiological studies aim to minimize bias problems or at least to characterize their potential influence.
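A back-of-the-envelope calculation shows why the required sample size grows so quickly as dose falls: if excess relative risk is proportional to dose, the radiation signal shrinks linearly while the statistical noise from background cancers shrinks only with the square root of cohort size, so the needed cohort scales roughly as 1/dose squared. The baseline risk and risk coefficient below are assumed, illustrative inputs.

    # Crude detectability criterion: expected excess cases must exceed z standard
    # deviations of the expected background count. Inputs are assumed values.
    BASELINE_RISK = 0.20      # hypothetical lifetime baseline cancer risk in the cohort
    ERR_PER_GY = 0.5          # hypothetical excess relative risk per Gy

    def cohort_size_needed(dose_gy, z=1.96):
        excess_per_person = BASELINE_RISK * ERR_PER_GY * dose_gy
        # require N * excess_per_person > z * sqrt(N * BASELINE_RISK)
        return (z ** 2) * BASELINE_RISK / (excess_per_person ** 2)

    for dose in (1.0, 0.1, 0.01):   # Gy
        print(f"{dose*1000:>6.0f} mGy: rough cohort size ~ {cohort_size_needed(dose):,.0f}")

Under these assumptions a study adequate at 1 Gy needs on the order of a hundred subjects, at 100 mGy thousands, and at 10 mGy close to a million, before any allowance for confounding or dose uncertainty.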

The approaches taken to minimize bias in low-dose radiation epidemiology vary, in part depending on decisions about the study design, statistical methods, and opportunities available to investigators for data collection. Studies aiming to estimate risks at low radiation doses deal with challenging problems related to design, analysis, and interpretation of the study findings. These include

• Measurement errors related to estimation of radiation doses and uncertainties in dosimetric models,
• Confounding, which is related to factors other than radiation that cause the outcome of interest, and
• Model uncertainty in selection of dose-response models.

The NCRP (2012) and others have provided comprehensive reviews of the sources of bias and uncertainty in radiation epidemiological studies. The three most commonly used types of epidemiological studies designed to answer questions about low-dose radiation are cohort studies, case-control studies, and ecological studies. The following sections discuss those and risk-projection models.

Epidemiological Study Designs

Cohort Studies

Cohort studies typically follow a group of exposed and a group of unexposed individuals over time to determine disease occurrence in relation to radiation exposure.

Much of what we know about cancer risks at lower radiation levels comes from the study of the atomic-bombing survivors of Hiroshima and Nagasaki, Japan. The Life Span Study (LSS) cohort, the main cohort of these survivors, includes about 94,000 atomic-bombing survivors who were within 10 km of the bombs' hypocenters and about 26,000 individuals who were not in either of these two cities at the times of the bombings. The Radiation Effects Research Foundation and its predecessor, the Atomic Bomb Casualty Commission, tracked cancer incidence (Preston et al., 2007), cancer mortality (Ozasa et al., 2012), and non-cancer-related mortality (Neriishi et al., 2012; Ozasa et al., 2012; Shimizu et al., 2010) rates in the LSS cohort. Estimates of doses to cohort members reflect major efforts to retrospectively collect data on survivors' locations and degree of shielding at the time of the bombings (Cullings et al., 2006). Dosimetry systems have been revised recently to derive exposure estimates for many (but not all) members of the cohort. Since the inception of the LSS, the investigators involved have been concerned with issues of confounding, measurement error, and selection bias.

_________________ Bias is an error in the measurement of a factor that can arise, for example, from choosing study subjects that are not representative of the overall population (i.e., selection bias) or from inaccurate or incomplete information on the disease or exposure (i.e., information bias).

To be part of the LSS, individuals must have survived for at least five years after the bombings, an inclusion criterion that may be related to the ability to survive acute effects (Pierce et al., 2007). Exposure was determined in part by each person's location and shielding conditions, and one concern is confounding of the radiation effect by other disease risk factors related to each individual's location and shielding conditions at the time of the bombing. Proximity to the hypocenter in Hiroshima, for example, was not only related to radiation exposure from the bomb but also potentially related to each individual's occupation, socioeconomic status (SES), and other independent risk factors for mortality that simultaneously can be correlated with dose. Another concern is exposure misclassification caused by uncertainty in determining the survivors' location and shielding. Studies have varied in their use of self-reported information to classify survivors with respect to distance from the hypocenter, level of shielding, and the presence of acute radiation effects such as hair loss. Selection bias may arise because of the effect of conditioning on survival; this may induce associations between location (and therefore exposure) and risk factors that were not confounders in the base population. Selection bias may also influence generalizability if the survivors differ from the populations to which the risk estimates are applied.

Further, in an observational study, the exposure may be seen to have both direct and indirect effects; for example, the experience of surviving the atomic-bomb blast may not simply have direct effects related to the radiation dose but may also have indirect effects because the experience could lead to changes in health-related behaviors.

Case-Control Studies

Case-control studies aim to determine whether the frequency of exposure is higher in the group of people with the disease of interest (the cases) than it is in the group without the disease (the controls).

As in the case of cohort designs, confounding, measurement error, and selection bias are concerns in case-control designs. In some situations, case-control designs allow collection of more detailed information on covariates than would be feasible to assemble for a large cohort; this may help to address concerns about confounding by ascertaining information on a longer list of potential confounders of concern. A disadvantage of case-control designs is the potential for different recall of exposure information by case and control subjects. In studies in which self-reported exposure information is used as the basis for classifying people with respect to exposure, this may introduce bias. Such concerns were raised with respect to interpretation of findings from the Oxford Survey of Childhood Cancer (OSCC), a highly influential case-control study on low-dose ionizing radiation effects (Doll and Wakeford, 1997; NCRP, 2013). One way to address these concerns is to rely on information recorded before diagnosis of the case or symptom onset.

For example, OSCC investigators were able to draw on medical records as an alternative to self-reporting as a source of information about in utero X-ray exposure for many of the case and control subjects. Differences in participation between the cases and controls may also lead to bias.

This would occur, for example, if most cases agreed to participate in a study but most lower-SES controls did not, and SES was related to the distribution of exposure in the study population.

Ecological Studies

Ecological studies (also called geographical studies) are those in which exposures and outcomes are assessed for groups of individuals within neighborhoods, counties, or other geographical units rather than at the individual level. Ecological designs have been used for the study of protracted low-level exposure to ionizing radiation (Little et al., 2010b). As with other study types, problems of confounding, measurement error, and selection bias are important considerations. A unique problem of inference arises with ecological studies, compared with cohort studies: the possibility of errors when the findings from ecological designs are used to draw conclusions about the existence of causal associations at the individual level.

This is referred to as ecological fallacy. _________________ The ecological fallacy arises when the relationships observed for groups are used to draw conclusions about individuals.

For example, it has been shown that countries with high fat intake also have high rates of breast cancer. This does not necessarily imply that women with high fat intake are at higher risk of developing breast cancer (Holmes et al., 1999).

Risk-Projection Models

Risk-projection models are used to predict excess cancer risks by combining population-dose estimates with existing risk coefficients, thus transferring risks across populations with different baseline rates.
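A minimal sketch of such a projection, assuming a collective-dose calculation with a single lifetime risk coefficient, is shown below; the coefficient and population figures are illustrative assumptions, not values taken from BEIR VII or from this report, and a real projection would use age-, sex-, and site-specific coefficients and account for DDREF and baseline-rate transfer.

    # Illustrative risk projection: expected excess cancers from a uniform low dose
    # delivered to a population, using a single assumed lifetime excess-risk coefficient.
    POPULATION = 100_000
    MEAN_DOSE_GY = 0.01                 # 10 mGy per person
    RISK_PER_GY = 0.05                  # hypothetical lifetime excess cancer risk per Gy

    collective_dose_person_gy = POPULATION * MEAN_DOSE_GY
    expected_excess_cases = collective_dose_person_gy * RISK_PER_GY
    print(f"collective dose: {collective_dose_person_gy:,.0f} person-Gy")
    print(f"expected excess cancers (under the assumed coefficient): {expected_excess_cases:,.1f}")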

Recent applications of risk-projection modeling have increased for two main reasons: the approach provides timely estimates by making use of the risk estimates for U.S. populations published in the BEIR VII report (NRC, 2006), and there is increasing acceptance of the limitations of epidemiological studies of low-dose radiation exposures (Berrington et al., 2011). However, this approach may lead to erroneous conclusions in several ways. First, the insights drawn from a historical epidemiological study may provide a distorted, or biased, model for projecting an exposure's effect on disease occurrence in the study population. This is called a problem of internal validity. Such problems might result from confounding of the estimate due to an imbalance in the distribution of disease risk factors other than radiation, errors in historical estimates of radiation exposures, or health-related selection into the exposure groups. Second, the population that one wishes to draw inferences about may not be comparable with the historical population that was studied; this is considered to be a problem of external validity.

Differences in susceptibility, exposure to other hazards (for example, smoking), or access to medical and preventive care may work against a valid extrapolation of radiation-risk estimates from a historical study population to the contemporary population of interest because of differences in baseline cancer rates.

Cancer Risk at Low Doses

A common comment about the atomic-bombing survivors' LSS cohort is that it is a high-dose group, and one thus needs to extrapolate radiation-risk estimates to address questions about the effects of lower doses. In fact, Preston and colleagues (2007) noted that about 85% of the cohort received estimated radiation doses to the colon of less than 200 mGy. There was evidence of increased solid-cancer incidence even when the analysis was restricted to LSS cohort members who received doses of 150 mGy or less (Preston et al., 2007).

Evidence was less strong when the analysis was restricted to cohort members who received doses of 100 mGy or less. Similar results were seen for cancer mortality. Epidemiological studies on populations exposed to still lower doses (with presumably correspondingly lower radiation risks) are likely to yield more uncertain results. An approach to epidemiological assessment of risks at such lower doses is to focus on scenarios in which the signal-to-background ratio is likely to be high. One example is the study of childhood cancers after in utero medical diagnostic imaging. Here, the excess relative risk among the exposed subjects (the signal) is expected to be high because they were exposed at a critical point in their development, and the background is expected to be low because childhood cancers are rare. Indeed, the OSCC detected a significant increase in childhood cancer risk for a mean dose of ∼10 mGy (Doll and Wakeford, 1997).

The results from this large study and others showing an association between in utero exposure and cancer risk in childhood (IARC, 2012) are widely accepted and have changed medical practice related to exposure of pregnant women to ionizing radiation. However, NCRP (2013) questioned whether the relationship is causal. The same logic—focusing on subgroups of the population in which the excess relative risk of cancer after radiation exposure is posited to be largest—applies to two more recently published epidemiological studies of cancer risks associated with pediatric exposure to computed tomography (CT) scans, both of which had a relatively short mean follow-up of ~10 years (Mathews et al., 2013; Pearce et al., 2012). The relatively short follow-up after pediatric exposure permits detection of radiation-induced cancers with short latency while excluding investigation of those cancers that may appear at later ages. Both of these studies had large cohorts, and both showed a statistically significant association between the number of CT scans and increased cancer risk. Pearce and colleagues concluded that in children, the use of CT scans that deliver cumulative doses of ~50 mGy might triple the risk of leukemia, and doses of ~60 mGy might triple the risk of brain cancer. Mathews and colleagues reported finding a statistically significant dose-response relationship over the range of zero to more than three CT scans, and the cancer incidence–rate ratio increased by 0.16 (95% confidence interval [CI]: 0.13–0.19) for each additional scan.
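To make the reported dose-response concrete, the fragment below simply applies the per-scan increment from Mathews and colleagues (2013) under the simplifying assumption that the increase is additive in the number of scans; it is not a re-analysis of either study.

    # Rough illustration of the reported dose-response: the incidence-rate ratio
    # increased by 0.16 for each additional CT scan. Treating the increase as
    # additive per scan is a simplification of the published analysis.
    IRR_INCREMENT_PER_SCAN = 0.16

    for n_scans in range(0, 4):
        irr = 1.0 + IRR_INCREMENT_PER_SCAN * n_scans
        print(f"{n_scans} CT scan(s): implied incidence-rate ratio ~ {irr:.2f}")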

One concern that has been raised is the possibility of confounding by indication (NCRP, 2012); i.e., a factor that prompted the CT scanning might itself also be a risk factor for the subsequent cancer (reverse causality). For example, an undiagnosed brain tumor might have brought about the symptoms that led to the need for CT scanning and is itself a cause of later increased risk of brain cancer. The investigators of both studies acknowledged this potential for confounding and attempted to address it by conducting analyses that discounted cancers in the period 5 or 10 years immediately after the CT scanning.

The fact that associations between CT radiation dose and childhood cancer were observed many years after scanning diminishes the plausibility that latent disease led to the need for CT scanning. However, confounding by indication remains a concern that has been raised about potential bias in these studies. Other CT-specific issues include the potential for undocumented rescans necessitated by patient movement during the initial examination (NCRP, 2012). Of course, radiation-related excess cancer risks may persist for many decades after exposure (Preston et al., 2007). To date, studies of cancer development after pediatric CT scanning have not encompassed sufficient follow-up time to permit estimates of lifetime excess cancer risks.

_________________ Note that the absolute risk of childhood cancer is low, even among those exposed in utero. Before more modern technology was implemented, individual CT scan exposures could last for several seconds, leading to high exposures on repeated scans.

Recently, Boice and colleagues launched a study that aims to provide information on risks when a radiation dose is received gradually over many years (Boice, 2012). The study, known as the One Million U.S. Radiation Worker Study, focuses on the following occupational groups with differing radiation exposure patterns: uranium workers at multiple DOE locations (196,000 participants); DOE plutonium workers (155,000 participants); nuclear weapons test participants (atomic veterans) (120,000 participants); nuclear power plant workers (236,000 participants); and industrial radiographers, radiologists, and other medical practitioners (300,000 participants) (Boice, 2011). Still at its initial stages, the study may help inform radiation-protection standards for chronic exposure.

Other Health Consequences

Radiation exposure at doses higher than 1 Gy is associated with many noncancer health effects, but little existing evidence addresses such effects at doses less than 1 Gy. Many questions remain about biological mechanisms operating at moderately high to low doses (i.e., at the 100 mGy–1 Gy range) and below that might affect the risk of disease.

For example, the association between low-dose radiation exposure and circulatory diseases is increasingly gaining attention (ICRP, 2012). However, established risk factors for circulatory disease (such as smoking, diet, obesity, diabetes, and elevated blood pressure) could act as confounding influences, which prevents confident interpretation of the reported statistical associations (Little et al., 2008, 2010a; Ozasa et al., 2012). Recent studies of cohorts including atomic-bombing survivors (Shimizu et al., 2010), the Mayak radiation workers (Azizova et al., 2012), Chernobyl emergency workers (Ivanov et al., 2006), and nuclear workers (McGeoghegan et al., 2008) have shown an increase in circulatory disease at higher doses. Informed by the results of these recent studies, ICRP listed circulatory disease as a radiation health effect (ICRP, 2012).

_________________ The Mayak worker cohort is a collaborative American–Russian study of workers employed in the plutonium and reprocessing plants in the Chelyabinsk region of the Russian Federation that started in the late 1940s. Workers in the reactors and in the plutonium and radiochemical plants were exposed to external gamma radiation during their employment at Mayak (Shilnikova et al., 2003).

Additionally, the eye lens is relatively radiosensitive, and cataract formation has been associated with exposure to ionizing radiation for decades (ICRP, 1991). Recent studies provided a strong scientific basis for ICRP's revision of standards for protection of the eye, which lowered the allowable acute dose to the eye from 5 Gy to 0.5 Gy (ICRP, 2012). A reanalysis of atomic-bombing survivor data suggests a linear dose response for cataract induction, with an estimated threshold of 0.1 Gy (Neriishi et al., 2007), and a radiation effect for vision-impairing cataracts—that is, clinically significant cataracts that require surgery—at doses less than 1 Gy (Neriishi et al., 2012). Data from the U.S. radiologic technologists study, one of the largest undertaken to date on cataract risk, suggested an increased risk of cataracts arising from cumulative doses of a few tens of milligrays (Chodick et al., 2008).

Further support for cataract formation after exposure to less than 1 Gy comes from studies of commercial airline pilots and Chernobyl clean-up workers (Worgul et al., 2007).

SUMMARY AND OBSERVATIONS

The committee's examination of current directions in radiobiological science related to the human health risks from exposure to ionizing radiation led it to several findings and conclusions. Although much is known about the health effects after exposure to radiation at the 100 mGy–1 Gy dose range and high dose rates, the scientific uncertainty concerning the effects of low-dose radiation is considerable. Debate continues about how to extrapolate radiation risks at low doses, the biological effectiveness of low-dose radiation, and the effects of dose rate and of external versus internal exposure. Nevertheless, it is exposures at low doses that are of primary interest for radiation-protection purposes, and decisions are needed in setting standards for protection of individuals against the adverse effects of low-dose radiation. The health effect of primary concern in the context of low-level radiation is cancer. However, recent data suggest that other effects, such as cardiovascular diseases and cataracts, may occur after exposure at lower doses than previously thought, and this has led to changes in international recommendations for radiation protection.

Low-dose radiation research involves both experimental studies of radiation effects on molecules, cells, tissues, and animal models and observational studies of populations (i.e., epidemiological studies). Although epidemiological studies can provide the most direct evidence of associations between exposure to low-dose radiation and disease in humans, experimental studies are essential to improving our understanding of the mechanisms.

REFERENCES

Berrington de Gonzalez, A., et al. 2011. Evolving strategies in epidemiologic research on radiation and cancer. Radiation Research 176(4):527-532.

Besaratinia, A., et al. 2012. A high-throughput next-generation sequencing-based method for detecting the mutational fingerprint of carcinogens. Nucleic Acids Research 40(15):e116.

Boice, J. D. 2011. The One Million U.S. Radiation Worker Study. Department of Energy Low Dose Radiation Research Program Investigators' Workshop, Bethesda, MD (accessed April 11, 2014).

Boice, J. D. 2012. A study of one million U.S. radiation workers and veterans (The Boice Report #6). Health Physics News (accessed April 11, 2014).

Boltze, C., et al. 2009. Sporadic and radiation-associated papillary thyroid cancers can be distinguished using routine immunohistochemistry. Oncology Reports 22(3):459-467.

Brenner, D. J. 1999. Does fractionation decrease the risk of breast cancer induced by low-LET radiation? Radiation Research 151:225-229.

Brenner, D. J. 2009. Extrapolating radiation-induced cancer risks from low doses to very low doses. Health Physics 97(5):505-509.

Brenner, D. J. 2014. What we know and what we don't know about cancer risks associated with radiation doses from radiological imaging. British Journal of Radiology 87(1035).

Brenner, D. J., et al. 2003. Cancer risks attributable to low doses of ionizing radiation: Assessing what we really know. Proceedings of the National Academy of Sciences of the United States of America 100(24).

Brenner, D. J., et al. 2013. Letter to John P. Holdren, Director of the White House Office of Science and Technology Policy, on the future of low-dose radiation research in the U.S. (accessed February 27, 2014).

Brooks, A. L. 2011. Is a dose and dose-rate effectiveness factor (DDREF) needed following exposure to low total radiation doses delivered at low dose-rates? Health Physics 100(3):262.

Bushberg, J. T., et al. 2002. The essential physics of medical imaging, second edition. Philadelphia, PA: Lippincott Williams & Wilkins.

Carnes, B. A., and T. E. Fritz. 1991. Responses of the beagle to protracted irradiation: Effect of total dose and dose rate. Radiation Research 128(2):125-132.

Carnes, B. A., et al. 1998. An interspecies prediction of the risk of radiation-induced mortality. Radiation Research 149(5):487-492.

Chodick, G., et al. 2008. Risk of cataract after exposure to low doses of ionizing radiation: A 20-year prospective cohort study among US radiologic technologists. American Journal of Epidemiology 168(6):620-631.

Cullings, H. M., et al. 2006. Dose estimation for atomic bomb survivor studies: Its evolution and present status. Radiation Research 166(1):219-254.

Dauer, L. T., et al. 2010. Review and evaluation of updated research on the health effects associated with low-dose ionising radiation. Radiation Protection Dosimetry 140(2):103-136.