zebrafish
Rescooped by Yonghua Sun from Ag Biotech News

“Zero” in terms of food policy and risk perception - Matsuo & Yoshikura (2013) - Food Policy


In the present article, the concept of “zero” is discussed using three examples: chemical contaminants/residues in foods, radionuclides, and GM foods. These examples share several common features. First, risk reduction is generally quantified as a “fold” reduction, i.e., risk reduction by twofold, 10-fold, etc., as no other convenient parameters are available. Second, there are situations where “zero” virtually does not exist, such as in the case of toxicants in GM plants or of radionuclides, because a background always exists. Third, assuring “zero” is often impossible for technological and other reasons. This shows the critical importance of understanding and communicating “zero” in terms of food policy.

 

Despite the fact that risk managers employ various approaches to manage food safety issues, they often follow an overcautious “zero tolerance” policy rather than a more risk-based approach. This cautiousness is largely the result of pressure from consumers and politicians and/or a fear of unexpected socio-economic consequences that may compromise their position. In assessing risks and communicating the results of risk assessments in such a situation, it is critical to understand what “zero” means.

 

Risk reduction is generally considered in terms of “fold reduction” as there is no other convenient parameter to express risk reduction quantitatively… “fold reduction” is endless by its nature and there will be no “zero”… How to communicate these issues is important in risk communication.

 

Under the current risk analysis framework, the acceptable daily intake (ADI) of a chemical hazard, such as a contaminant, food additive, or pesticide residue in food, is calculated first by determining the no-observed-adverse-effect level (NOAEL)… The ADI is usually obtained by dividing the NOAEL by default “uncertainty factors” or a “safety factor” of 100 to extrapolate from animal testing to humans while taking into account differences among individuals, so as to assure a safety range. The maximum level (ML) allowed in a food, or the maximum residue level (MRL) in the case of pesticides and veterinary drugs, is the regulatory risk management standard determined so as to keep total consumption within the ADI, taking the food’s consumption pattern (food-basket survey) into account… 
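To make the ADI arithmetic concrete, here is a minimal sketch in Python. Only the 100-fold safety factor and the 29 mg/kg/day NOAEL (the figure quoted later in the excerpt) come from the text; the body weight and consumption figures are invented purely for illustration.

```python
# Sketch of the ADI/MRL arithmetic described above. The 100-fold safety factor
# is from the text; the NOAEL is the value quoted later in the excerpt, and
# the body weight and food-consumption figures are hypothetical.

NOAEL = 29.0           # mg/kg body weight/day, from animal testing
SAFETY_FACTOR = 100    # 10x animal-to-human extrapolation x 10x inter-individual variation

ADI = NOAEL / SAFETY_FACTOR
print(f"ADI = {ADI:.2f} mg/kg bw/day")              # 0.29 mg/kg bw/day

# An ML/MRL is then chosen so that estimated total intake stays within the ADI,
# using consumption data from a food-basket survey (numbers below are invented).
body_weight = 60.0                                  # kg, reference consumer
consumption = {"cereal": 0.30, "vegetables": 0.25}  # kg eaten per day, hypothetical
allowed_daily_dose = ADI * body_weight              # mg/day
illustrative_mrl = allowed_daily_dose / sum(consumption.values())
print(f"Illustrative MRL = {illustrative_mrl:.1f} mg/kg of food")
```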

 

As the ADI is set 100-fold lower than the NOAEL, the notion of “fold reduction of risk”… is built into the concept of the ADI itself… The fold reduction can be expressed appropriately on a logarithmic scale (the upper panel of the article’s figure), where the twofold risk reduction in situations A and B remains visible and its size stays the same. However, the actual level of toxicants can also be expressed on a linear scale (the lower panel), on which the toxicant levels become infinitesimally close to zero. In other words, while the linear scale clearly shows that a twofold reduction from the ADI, or from a level set 1000-fold below the NOAEL, is insignificant, on the logarithmic scale a twofold risk reduction from the ADI and one from a level set 1000-fold lower than the NOAEL are both clearly visible and of the same size. It should be noted that only when thinking on the logarithmic scale can risk reduction expressed as a “fold reduction” be appreciated.
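A few lines of arithmetic make the scale argument concrete; the numbers are hypothetical and simply show why a twofold reduction has a constant size on a logarithmic scale but shrinks toward nothing on a linear one.

```python
import math

# Compare a twofold reduction starting from the ADI (NOAEL/100) with one
# starting from a stricter, hypothetical level set 1000-fold below the NOAEL.
NOAEL = 29.0                      # mg/kg bw/day, illustrative
levels = {"ADI (NOAEL/100)": NOAEL / 100, "NOAEL/1000": NOAEL / 1000}

for label, level in levels.items():
    halved = level / 2
    linear_change = level - halved                       # shrinks as the level approaches zero
    log_change = math.log10(level) - math.log10(halved)  # always log10(2), about 0.301
    print(f"{label}: {level:.4f} -> {halved:.4f} | "
          f"linear change {linear_change:.4f} mg/kg bw/day | log10 change {log_change:.3f}")
```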

 

To complicate the situation further, consumers often consider that a slight upward deviation from the ADI level (a measurement of quantity, which is intrinsically on a linear scale) will pose health risks, while neglecting the fact that the ADI is already set 100-fold lower than the observed NOAEL… The above arguments may appear to be an unnecessary use of sophistry, but we encounter similar situations in many of the examples presented below…

 

There are many approaches that a risk manager may use to determine the appropriate level of protection (ALOP) and to develop risk management options…

(1) The “notional zero risk” approach, under which the level of negligible risk (the “notional zero risk”) is predetermined by conducting a routine risk assessment; this approach is most widely used in setting MLs (as described above) for chemical hazards in food.

(2) The “as low as reasonably achievable” (ALARA) approach is usually taken when managing food hazards that cannot be entirely eliminated (e.g., microbiological pathogens) but need to be minimized because they still have an adverse health effect; such hazards are often limited to the level that is technically and/or economically achievable.

(3) The “threshold” approach sets a standard to comply with a predetermined specific numerical level set in public policy. An example of this approach is illustrated by the addition of food coloring and food additives in US foods. The amount of added color and other food additives must be kept below the level that induces an additional cancer risk in 1/1,000,000 consumers…

(4) The “benefit-cost” approach deploys a risk assessment, as well as a benefit-cost analysis, to strike a balance between the expected benefit of risk reduction and the monetary cost of the required measures.

(5) The “comparative risk” approach requires that a comparative analysis be made between the risk and the countervailing risk of avoiding such a risk. For example, the risk of methyl mercury in fish is weighed against the nutritional benefit of consuming fish.

(6) The “precautionary” approach is taken when there is information that suggests that there is a hazard that poses a significant adverse health effect, but where there is insufficient data regarding the hazard to ban it.

 

None of these approaches aim to achieve “zero” risk. Rather, they try to offer a framework for taking the most appropriate measures, recognizing that there is always a grey zone associated with the placement of a specific numerical standard. Numerical standards are based on the outcome of choices made when considering a safety range, suggesting that there should be some flexibility in the established measures. Nevertheless, many consumers often reject the presence of a grey zone, and pursue “zero-risk”. Thus, there is an inherent tension between the consumers’ pursuit of “zero risk” and the more practical, risk-based decisions made by regulators.

 

Regulator’s dilemma… foods containing a regulated substance at a level above the ML may be accidentally placed on the market. If such a food is marketed in very small amounts, the consequent risk, from a public health viewpoint, is negligible. In such cases, health officers often try to convince the public that there is no harm from a public health viewpoint. However, those who have actually consumed the contaminated foods are never satisfied, because they have eaten food containing a “toxic substance at twice the ML”. Those who have consumed the food may claim that the health effect could appear later in life. This situation cannot be easily and convincingly explained to consumers by regulators, and poorly prepared risk communication can increase consumer ire…

 

If country A chooses an ADI of 0.3 mg/kg/day and country B chooses an ADI of 0.15 mg/kg/day, people will assume that the measures adopted by country A carry twofold more risk than those adopted by country B. They would not consider that country A’s measure is only 0.15 mg/kg/day higher than country B’s, which corresponds to a difference of only about 0.5% (0.15/29) relative to the NOAEL (29 mg/kg/day). A difference of 0.5% is within the error range of the NOAEL measurement…
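The 0.5% figure can be reproduced directly from the numbers quoted in the excerpt.

```python
# Reproducing the comparison above (all values are taken from the text).
NOAEL = 29.0    # mg/kg bw/day
adi_a = 0.30    # country A's ADI
adi_b = 0.15    # country B's ADI

difference = adi_a - adi_b                 # 0.15 mg/kg bw/day
share_of_noael = difference / NOAEL * 100  # about 0.5% of the NOAEL
ratio = adi_a / adi_b                      # the "twofold" difference consumers see
print(f"{ratio:.0f}-fold apart, yet only {share_of_noael:.1f}% of the NOAEL")
```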

 

Following the March 2011 tsunami in the Tohoku area of Japan, an accidental release of radioisotopes from the Fukushima Nuclear Electric Power Station resulted in the government’s implementation of provisional radionuclide regulations. However, in response to consumer anxiety, some food retailers adopted voluntary measures that required their products to meet even lower maximum isotope levels… These voluntary initiatives were praised by consumers and the media, but at the same time, caused public confusion regarding the government’s safety standards…

 

Precision of technologies and “zero” tolerance: For toxins, when a low-sensitivity detection method is used, the results may show that the toxin is absent (equal to “0”), i.e., “false negatives”. However, methodological improvements may reveal the presence of the toxin. Later identification of the presence of toxins by an improved technique is a challenge to regulators when they adopt “zero” tolerance policies.

 

“False positives” are the opposite of “false negatives”. In the aftermath of the terrorist attacks on the United States, the discovery of anthrax spores sent through the postal system resulted in increased public fear. Rapid detection machines were developed to check mail for the presence of anthrax. The machines ended up as complete failures because the mail-handling system stopped every few minutes due to the detection of false positives… Even the best apparatus, with a 99.9% accuracy rate, will produce one false positive per 1000 envelopes. If a post office processes one million letters per day, the mail-handling system would stop roughly every minute, even with such a high degree of accuracy…
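The envelope arithmetic behind that claim, using the figures given in the text (the sorting hours are an assumption added for illustration):

```python
# False-positive arithmetic for the anthrax-detector example.
specificity = 0.999          # "99.9% accuracy": one false alarm per 1,000 clean letters
letters_per_day = 1_000_000  # figure from the text

false_alarms_per_day = letters_per_day * (1 - specificity)   # about 1,000 per day
for hours in (24, 16):       # a full day vs. an assumed 16-hour sorting shift
    minutes = hours * 60
    print(f"{hours} h of sorting: one false alarm every "
          f"{minutes / false_alarms_per_day:.1f} minutes")
```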

 

As mentioned above, a supermarket chain announced that it aimed for “zero” radionuclides in its products in response to fears caused by the radiation release following the Japanese tsunami. In reality, such a target is hard to achieve due to the background level of natural radiation present in food. The quest for “zero” tolerance is an illusion…

 

The “zero tolerance” policy surrounding GM foods is entirely different from the zero tolerance associated with chemicals or radiation exposure…

 

A topic related to “zero tolerance” was found in the risk assessment of GM foods. In the early 1990s, the Organization for Economic Co-operation and Development (OECD) proposed the concept of “substantial equivalence” for food safety... The concept, in short, states that a comparative safety assessment should be conducted using a non-modified food, a conventional counterpart, as a reference. The concept suggests that past experience with the consumption of a particular food is a guide to the determination of whether a GM food is as safe as the conventionally produced food. This concept does not attempt to pursue the absolute safety of a food. The reason for this approach is the recognition that there is no “zero” risk for foods. All plant species produce certain toxins such as solanine in potatoes to defend themselves from insects and other predators. Unbalanced food intake, or the excessive intake of calories, will result in metabolic disorders or diseases…

 

As the volume of GM foods (mainly grains) in international trade increases, GM grains that have not been authorized for human consumption have accidentally been commingled in food and feed supplies… These cases caused conflicts between GM-exporting and GM-importing countries. Different countries took different approaches upon discovery of these accidental introductions of GM material. For example, in the case of GM foods containing Bt10… some countries, like the US, followed a risk-based approach based on the findings of the Environmental Protection Agency, which indicated that the Bt10 protein was safe and that Bt10 corn constituted an extremely small amount of the overall food and feed supply… Other countries, like Japan and the European Union (EU)… chose a “zero tolerance” approach (i.e., did not permit the consumption of the unauthorized GM food unless a full risk assessment was completed) and ordered consignments to be discarded upon discovery of traces of Bt10 at the harbor…

 

As a result, the Codex task force produced a guideline on risk assessment for the low-level presence of foods derived from modern biotechnology… The guidance document provides for risk assessment in low-level-presence situations and for the provision of information on approved GM varieties, by exporting countries, on the FAO web site…

 

A lengthy debate took place over how to approach GM labeling. Here, again, countries differed in their approaches. The US Food and Drug Administration’s approach to GM foods is essentially the same as its approach to foods developed using traditional plant breeding… wherein labeling is not required unless there is a significant change in the nutritional value or allergenicity of the foodstuff. Other GM crop-producing countries, like Canada and Argentina, have taken a similar approach to GM food labeling.

 

On the other hand, the EU, Japan, Australia, and others require labeling of GM food, in the context of “consumer choice” and “consumer right-to-know” policies. These countries have set thresholds for the unintended or adventitious presence of GM material (the EU’s threshold is <0.9%; Japan’s, <5%; and Australia’s, <1%). The same questions regarding detection also arise with regard to these thresholds. What a consumer believes to be “zero” is not always zero, as there is always a possibility that GM material is present yet passes the established detection test(s).
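A trivial sketch of how those labeling thresholds would be applied to a measured GM content; the thresholds are the ones quoted above, while the measured value is a hypothetical lab result.

```python
# Applying the quoted adventitious-presence thresholds to a hypothetical measurement.
thresholds = {"EU": 0.9, "Japan": 5.0, "Australia": 1.0}   # % GM content, from the text
measured_gm_content = 0.7                                   # %, hypothetical lab result

for jurisdiction, limit in thresholds.items():
    needs_label = measured_gm_content >= limit
    print(f"{jurisdiction}: threshold {limit}% -> labeling required: {needs_label}")
```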

 

Other claims of “zero tolerance” have been based on individuals’ ethical concerns, such as “playing God should not be allowed”… or “GM allows multinational companies to monopolize global agriculture”… Certain religions have very strict prohibitions on the consumption of specific meats, raising questions about whether faithful adherents will be able to consume transgenic meat, such as fish carrying bovine or porcine genes. However, because the genes of vertebrates share substantial sequence homologies, a fish, whether transgenic or not, already has genes with sequence homology to the corresponding pig, cow, or human genes. In other words, people consuming an ordinary fish are already consuming proteins at least partially homologous to those of pigs or cows. At the gene level, “zero tolerance” is impossible.

 

The above arguments illustrate how deeply the concept of “zero” is embedded in the concept of risk, and how difficult it is to communicate the concept of “zero” to consumers. It is important to remind consumers of the reality that the pursuit of “zero” never leads us to “zero risk”. Efforts to reach zero can increase aggregate risk levels by shifting risk from one place to another.

 

http://dx.doi.org/10.1016/j.foodpol.2013.08.012


Via Alexander J. Stein
Alexander J. Stein's curator insight, September 15, 2013 4:44 PM

An excellent explanation, and the entire article is well worth a read. (Unfortunately it's behind a pay-wall.) 

Rescooped by Yonghua Sun from SynBioFromLeukipposInstitute

In the Future, We'll Program Cells Like Computers

Researchers are developing a method to program human cells to combat HIV, cancer, Alzheimer's—even aging

Via Gerd Moe-Behrens
Christopher Whelan's curator insight, October 28, 2013 4:43 PM

Time and time again, technology has been shown to shine brightest when it is used for the sake of mankind as a whole. Programming cells is such an advancement. The ability to do so could revolutionize how we view any type of illness, from the most common bacterial infection, to deadly viruses, to learning disabilities. It could eliminate the need for certain drugs without some of the severe side effects, as the body will be fighting its own battles, but doctors will be the generals in command. Is this technology a scary thought? Yes. Many questions would exist concerning what would happen if something went wrong. However, these questions have always existed, merely in different forms. The idea that a man could cut open another individual and manually repair an issue is terrifying, yet surgeries have become routine and are hardly given a second thought. Opening our minds to this kind of technology is the only way to move forward.

Rescooped by Yonghua Sun from Amazing Science

A new easier way to control genes should enable more complex synthetic biology circuits


The new method is based on a system of bacterial proteins that has recently been exploited to edit the genomes of bacterial and human cells. The original system, called CRISPR, consists of two components: a protein that binds to and slices DNA, and a short strand of RNA that guides the protein to the right location on the genome. 

“The CRISPR system is quite powerful in that it can be targeted to different DNA binding regions based on simple recoding of these guide RNAs,” Lu says. “By simply reprogramming the RNA sequence you can direct this protein to any location you want on the genome or on a synthetic circuit.”

The lead author of a paper describing the new approach in the journal ACS Synthetic Biology is Fahim Farzadfard, an MIT graduate student in biology. Samuel Perli, a graduate student in electrical engineering and computer science, is also an author. 

In previous studies, CRISPR has been used to snip out pieces of a gene to disable it or replace it with a new gene. Lu and his colleagues decided to use the CRISPR system for a different purpose: controlling gene transcription, the process by which a sequence of DNA is copied into messenger RNA (mRNA), which carries out the gene’s instructions.

Transcription is tightly regulated by proteins called transcription factors. These proteins bind to specific DNA sequences in the gene’s promoter region and either recruit or block the enzymes needed to copy that gene into mRNA.

For this study, the researchers adapted the CRISPR system to act as a transcription factor. First, they modified the usual CRISPR protein, known as Cas9, so that it could no longer snip DNA after binding to it. They also added to the protein a segment that activates or represses gene expression by modulating the cell’s transcriptional machinery.

To get Cas9 to the right place, the researchers also delivered to the target cells a gene for an RNA guide that corresponds to a DNA sequence on the promoter of the gene they want to activate.

The researchers showed that once the RNA guide and the Cas9 protein join together inside the target cell, they accurately target the correct gene and turn on transcription. To their surprise, they found that the same Cas9 complex could also be used to block gene transcription if targeted to a different part of the gene.

“This is nice in that it allows you to do positive and negative regulation with the same protein, but with different guide RNAs targeted to different positions in the promoter,” Lu says.

The new system should be much easier to use than two other recently developed transcription-control systems based on DNA-binding proteins known as zinc fingers and transcription activator-like effector nucleases (TALENs), Lu says. Although they are effective, designing and assembling the proteins is time-consuming and expensive. 

“There’s a lot of flexibility with CRISPR, and it really comes from the fact that you don’t have to spend any more time doing protein engineering. You can just change the nucleic acid sequence of the RNAs,” Lu says.
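The “just change the RNA sequence” point can be illustrated with a short, hedged sketch: the standard rule for S. pyogenes Cas9 is a 20-nt protospacer immediately followed by an NGG PAM, so candidate guide sequences for a promoter can be enumerated in a few lines. The promoter below is an invented toy sequence, and a real design would also scan the bottom strand and screen candidates for off-targets.

```python
# Minimal sketch: list candidate 20-nt guide sequences in a promoter by looking
# for top-strand NGG PAM sites (S. pyogenes Cas9 rule). Toy sequence only.

def find_guide_sites(promoter: str, guide_len: int = 20):
    """Return (start, protospacer, PAM) for every top-strand NGG PAM site."""
    promoter = promoter.upper()
    sites = []
    for start in range(len(promoter) - guide_len - 2):
        protospacer = promoter[start:start + guide_len]
        pam = promoter[start + guide_len:start + guide_len + 3]
        if pam[1:] == "GG":              # NGG: any base followed by two Gs
            sites.append((start, protospacer, pam))
    return sites

toy_promoter = "ACGTACGTACGTACGTACGTTGGATCCATGCTAGCTAAGGTCATCGATCG"
for start, protospacer, pam in find_guide_sites(toy_promoter):
    print(f"position {start:2d}  guide {protospacer}  PAM {pam}")
```

Each hit is a position where a different guide RNA could steer the same modified Cas9 fusion, which is the programmability Lu describes.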


Via Dr. Stefan Gruenwald
Rescooped by Yonghua Sun from CRISPR-Cas System for Eukaryotic Genome Engineering

RNA-guided gene activation by CRISPR-Cas9–based transcription factors


Perez-Pinera et al., 2013, Nature Methods

Technologies for engineering synthetic transcription factors have enabled many advances in medical and scientific research. In contrast to existing methods based on engineering of DNA-binding proteins, we created a Cas9-based transactivator that is targeted to DNA sequences by guide RNA molecules. Coexpression of this transactivator and combinations of guide RNAs in human cells induced specific expression of endogenous target genes, demonstrating a simple and versatile approach for RNA-guided gene activation.


Via Amir Taheri Ghahfarokhi
Rescooped by Yonghua Sun from SynBioFromLeukipposInstitute

(HD) Dr. Michio Kaku: The Biotech Revolution - Vision of the Future - Full Documentary


Via Gerd Moe-Behrens
Gerd Moe-Behrens's comment, October 22, 2013 4:17 PM
They recently changed it. Try this link http://www.youtube.com/watch?v=CG8TekgNEhA
malek's comment, October 22, 2013 5:25 PM
@socrates Logos: this link is active, thank you for sharing
Gerd Moe-Behrens's comment, October 22, 2013 5:27 PM
Great - you are welcome
Scooped by Yonghua Sun

Science mystery: In the future, people may be able to have children with themselves - 北国网

林 and his colleagues successfully used mouse skin cells to derive primordial germ cells (PG cells) in vitro for research related to developmental biology. To prove that the artificially derived PG cells are no different from those formed naturally in the organism, he cultured them into egg cells, and then …
Rescooped by Yonghua Sun from SynBioFromLeukipposInstitute

Double Nicking by RNA-Guided CRISPR Cas9 for Enhanced Genome Editing Specificity


Via Gerd Moe-Behrens
Gerd Moe-Behrens's curator insight, August 30, 2013 5:50 PM

by

F. Ann Ran, Patrick D. Hsu, Chie-Yu Lin, Jonathan S. Gootenberg, Silvana Konermann, Alexandro E. Trevino, David A. Scott, Azusa Inoue, Shogo Matoba, Yi Zhang, Feng Zhang


"Targeted genome editing technologies have enabled a broad range of research and medical applications. The Cas9 nuclease from the microbial CRISPR-Cas system is targeted to specific genomic loci by a 20 nt guide sequence, which can tolerate certain mismatches to the DNA target and thereby promote undesired off-target mutagenesis. Here, we describe an approach that combines a Cas9 nickase mutant with paired guide RNAs to introduce targeted double-strand breaks. Because individual nicks in the genome are repaired with high fidelity, simultaneous nicking via appropriately offset guide RNAs is required for double-stranded breaks and extends the number of specifically recognized bases for target cleavage. We demonstrate that using paired nicking can reduce off-target activity by 50- to 1,500-fold in cell lines and to facilitate gene knockout in mouse zygotes without sacrificing on-target cleavage efficiency. This versatile strategy enables a wide variety of genome editing applications that require high specificity."

http://bit.ly/15r5dAg
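To make the paired-nicking idea concrete, here is a hedged sketch of the pairing rule described in the abstract: two guide RNAs must nick opposite strands at appropriately offset positions for the two nicks to behave like a double-strand break. The distance window used below is an illustrative assumption, not a value taken from the paper.

```python
# Sketch of a paired-nickase sanity check: opposite strands plus a nearby offset.
from dataclasses import dataclass

@dataclass
class GuideSite:
    nick_position: int   # coordinate of the nick introduced by the Cas9 nickase
    strand: str          # "+" or "-"

def is_paired_nick(a: GuideSite, b: GuideSite, max_offset: int = 30) -> bool:
    """True if the two nicks sit on opposite strands within an assumed offset window."""
    on_opposite_strands = a.strand != b.strand
    close_enough = abs(a.nick_position - b.nick_position) <= max_offset
    return on_opposite_strands and close_enough

# Example: a nick on the + strand and a second one 15 bp away on the - strand.
print(is_paired_nick(GuideSite(1_000_015, "+"), GuideSite(1_000_030, "-")))   # True
print(is_paired_nick(GuideSite(1_000_015, "+"), GuideSite(1_000_030, "+")))   # False: same strand
```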