Amazing Science
Scooped by Dr. Stefan Gruenwald!

Researchers Create Perfect Quantum Dots with Single-Atom Precision


A team of physicists from the Paul-Drude-Institut für Festkörperelektronik (PDI) in Berlin, Germany, NTT Basic Research Laboratories in Atsugi, Japan, and the U.S. Naval Research Laboratory (NRL) has used a scanning tunneling microscope to create quantum dots with identical, deterministic sizes. The perfect reproducibility of these dots opens the door to quantum dot architectures completely free of uncontrolled variations, an important goal for technologies from nanophotonics to quantum information processing as well as for fundamental studies. The complete findings are published in the July 2014 issue of the journal Nature Nanotechnology.

Quantum dots are often regarded as artificial atoms because, like real atoms, they confine their electrons to quantized states with discrete energies. But the analogy breaks down quickly, because while real atoms are identical, quantum dots usually comprise hundreds or thousands of atoms - with unavoidable variations in their size and shape and, consequently, in their properties and behavior. External electrostatic gates can be used to reduce these variations. But the more ambitious goal of creating quantum dots with intrinsically perfect fidelity by completely eliminating statistical variations in their size, shape, and arrangement has long remained elusive.

Creating atomically precise quantum dots requires every atom to be placed in a precisely specified location without error. The team assembled the dots atom-by-atom, using a scanning tunneling microscope (STM), and relied on an atomically precise surface template to define a lattice of allowed atom positions. The template was the surface of an InAs crystal, which has a regular pattern of indium vacancies and a low concentration of native indium adatoms adsorbed above the vacancy sites. The adatoms are ionized +1 donors and can be moved with the STM tip by vertical atom manipulation. The team assembled quantum dots consisting of linear chains of N = 6 to 25 indium atoms; the example shown here is a chain of 22 atoms.

Stefan Fölsch, a physicist at the PDI who led the team, explained that "the ionized indium adatoms form a quantum dot by creating an electrostatic well that confines electrons normally associated with a surface state of the InAs crystal. The quantized states can then be probed and mapped by scanning tunneling spectroscopy measurements of the differential conductance." These spectra show a series of resonances labeled by the principal quantum number n. Spatial maps reveal the wave functions of these quantized states, which have n lobes and n - 1 nodes along the chain, exactly as expected for a quantum-mechanical electron in a box. For the 22-atom chain example, the states up to n = 6 are shown.
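The "electron in a box" picture can be sketched numerically: the n-th quantized state of a one-dimensional box has n lobes and n - 1 interior nodes. The short Python sketch below verifies that node count on a sampled wavefunction; the box length and grid size are arbitrary illustrative choices, not parameters of the actual InAs chains.

```python
import math

def box_wavefunction(n, length=1.0, num_points=1001):
    """Sampled particle-in-a-box state psi_n(x) = sqrt(2/L) * sin(n*pi*x/L)."""
    xs = [i * length / (num_points - 1) for i in range(num_points)]
    return [math.sqrt(2.0 / length) * math.sin(n * math.pi * x / length) for x in xs]

def count_interior_nodes(psi):
    """Count sign changes between interior samples (the endpoints are fixed zeros)."""
    interior = psi[1:-1]
    return sum(1 for a, b in zip(interior, interior[1:]) if a * b < 0)

# The n-th state has n lobes and n - 1 nodes along the chain, as in the STM maps:
for n in range(1, 7):
    assert count_interior_nodes(box_wavefunction(n)) == n - 1
print("each state n = 1..6 shows exactly n - 1 interior nodes")
```

The node structure depends only on n, so the same counting holds whatever box length is chosen.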

Rescooped by Dr. Stefan Gruenwald from Papers!

Sex Determination: Why So Many Different Ways in Nature of Doing It?


Sexual reproduction is an ancient feature of life on earth, and the familiar X and Y chromosomes in humans and other model species have led to the impression that sex determination mechanisms are old and conserved. In fact, males and females are determined by diverse mechanisms that evolve rapidly in many taxa. Yet this diversity in primary sex-determining signals is coupled with conserved molecular pathways that trigger male or female development. Conflicting selection on different parts of the genome and on the two sexes may drive many of these transitions, but few systems with rapid turnover of sex determination mechanisms have been rigorously studied. Here we survey our current understanding of how and why sex determination evolves in animals and plants and identify important gaps in our knowledge that present exciting research opportunities to characterize the evolutionary forces and molecular pathways underlying the evolution of sex determination.

Bachtrog D, Mank JE, Peichel CL, Kirkpatrick M, Otto SP, et al. (2014) Sex Determination: Why So Many Ways of Doing It? PLoS Biol 12(7): e1001899.

Via Complexity Digest
Arjen ten Have's curator insight, July 7, 2014 12:27 PM

I am getting more and more bored by all these pretentious-sounding papers in PLoS, although mostly in PLoS ONE. Here is another one; my question is: why is that even a question? Meaning, it is rather obvious. One of the things that apparently is still not clear is the difference between hard and soft selection: hard being defined as having an impact irrespective of the environment, soft as having an impact that depends on the environment. Sexual selection is typically hard, hence any mutation that affects sex, as in "to have sex or not to have sex", is bound to affect the offspring. Hence, if sex is an emergent property of evolution (and I for one believe it is), then many ways of sex signalling are a logical result of that emergent property. Or do I miss something?

Scooped by Dr. Stefan Gruenwald!

Researchers regrow human corneas in mice


A restored functional cornea following transplantation of human ABCB5-positive limbal stem cells to limbal stem cell-deficient mice.

Limbal stem cells, which reside in the eye’s limbus, help maintain and regenerate corneal tissue. Their loss due to injury or disease is one of the leading causes of blindness.

In the past, tissue or cell transplants have been used to help the cornea regenerate, but it was unknown whether there were actual limbal stem cells in the grafts, or how many, and the outcomes were not consistent.

In this new study, researchers at the Massachusetts Eye and Ear/Schepens Eye Research Institute (Mass. Eye and Ear), Boston Children’s Hospital, Brigham and Women’s Hospital, and the VA Boston Healthcare System used a molecule known as ABCB5, which acts as a marker for hard-to-find limbal stem cells.

ABCB5 allowed the researchers to locate hard-to-find limbal stem cells in tissue from deceased human donors and use these stem cells to regrow anatomically correct, fully functional human corneas in mice.

“Limbal stem cells are very rare, and successful transplants are dependent on these rare cells,” says Bruce Ksander, Ph.D., of Mass. Eye and Ear, co-lead author on the study with post-doctoral fellow Paraskevi Kolovou, M.D. “This finding will now make it much easier to restore the corneal surface. It’s a very good example of basic research moving quickly to a translational application.”

ABCB5 was originally discovered in the lab of Markus Frank, M.D., of Boston Children’s Hospital, and Natasha Frank, M.D., of the VA Boston Healthcare System and Brigham and Women’s Hospital (co-senior investigators on the study) as being produced in tissue precursor cells in human skin and intestine.

In the new work, using a mouse model developed by the Frank lab, they found that ABCB5 also occurs in limbal stem cells and is required for their maintenance and survival, and for corneal development and repair. Mice lacking a functional ABCB5 gene lost their populations of limbal stem cells, and their corneas healed poorly after injury.

“ABCB5 allows limbal stem cells to survive, protecting them from apoptosis [programmed cell death],” says Markus Frank. “The mouse model allowed us for the first time to understand the role of ABCB5 in normal development, and should be very important to the stem cell field in general,” adds Natasha Frank.

Markus Frank is working with the biopharmaceutical industry to develop a clinical-grade ABCB5 antibody that would meet U.S. regulatory approvals.

Scooped by Dr. Stefan Gruenwald!

A graphene replacement made from plastic


Spin-coating a polymer solution (green) to create a carbon nanosheet with characteristics similar to graphene, without the defects (black).

A team of Korean researchers has synthesized hexagonal carbon nanosheets similar to graphene, using a polymer. The new material is free of the defects and complexity involved in producing graphene, and can substitute for graphene as transparent electrodes for organic solar cells and in semiconductor chips, the researchers say. 

The research team is led by Han-Ik Joh at Korea Institute of Science and Technology  (KIST), Seok-In Na at Chonbuk National University, and Byoung Gak Kim at Korea Research Institute of Chemical Technology. The research was funded by the KIST Proprietary Research Project and National Research Foundation of Korea.

Na explains: "Through a catalyst- and transfer-free process, we fabricated indium tin oxide (ITO)-free organic solar cells (OSCs) using a carbon nanosheet (CNS) with properties similar to graphene. The morphological and electrical properties of the CNS, which is derived from a polymer of intrinsic microporosity-1 (PIM-1), mainly composed of several aromatic hydrocarbons and cycloalkanes, can be easily controlled by adjusting the polymer concentration. The CNSs, which are prepared by simple spin-coating and heat-treatment on a quartz substrate, are directly used as the electrodes of ITO-free OSCs, showing a high efficiency of approximately 1.922% under 100 mW cm−2 illumination and air mass 1.5 G conditions. This catalyst- and transfer-free approach is highly desirable for electrodes in organic electronics."
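The quoted efficiency follows the standard definition of power conversion efficiency: output power density (short-circuit current density × open-circuit voltage × fill factor) divided by the 100 mW/cm² illumination. A minimal sketch; the Jsc, Voc, and fill-factor values below are illustrative assumptions, not numbers from the paper:

```python
def power_conversion_efficiency(jsc_ma_cm2, voc_v, fill_factor, p_in_mw_cm2=100.0):
    """PCE = (Jsc * Voc * FF) / Pin, all per unit area.

    Jsc in mA/cm^2 times Voc in V gives output power density in mW/cm^2.
    """
    p_out = jsc_ma_cm2 * voc_v * fill_factor  # mW/cm^2
    return 100.0 * p_out / p_in_mw_cm2        # percent

# Illustrative values only (not from the paper): Jsc = 8 mA/cm^2, Voc = 0.6 V, FF = 0.4
pce = power_conversion_efficiency(8.0, 0.6, 0.4)
print(f"PCE = {pce:.2f}%")  # 1.92% -- the same ballpark as the ~1.922% reported
```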

Scooped by Dr. Stefan Gruenwald!

3D model of Eta Carinae, a binary star system over 5 million times more luminous than the sun


An international team of astronomers has developed a 3D model of a giant cloud ejected by the massive binary system Eta Carinae during its 19th century outburst. Eta Carinae lies about 7,500 light-years away in the southern constellation of Carina and is one of the most massive binary systems astronomers can study in detail. The smaller star is about 30 times the mass of the sun and may be as much as a million times more luminous. The primary star contains about 90 solar masses and emits 5 million times the sun's energy output. Both stars are fated to end their lives in spectacular supernova explosions.

Between 1838 and 1845, Eta Carinae underwent a period of unusual variability during which it briefly outshone Canopus, normally the second-brightest star. As a part of this event, which astronomers call the Great Eruption, a gaseous shell containing at least 10 and perhaps as much as 40 times the sun's mass was shot into space. This material forms a twin-lobed dust-filled cloud known as the Homunculus Nebula, which is now about a light-year long and continues to expand at more than 1.3 million mph (2.1 million km/h). 

Using the European Southern Observatory's Very Large Telescope and its X-Shooter spectrograph, the team imaged near-infrared, visible and ultraviolet wavelengths along 92 separate swaths across the nebula, making the most complete spectral map to date. The researchers have used the spatial and velocity information provided by this data to create the first high-resolution 3D model of the Homunculus Nebula.

The shape model was developed using only a single emission line of near-infrared light emitted by molecular hydrogen gas. The characteristic 2.12-micron light shifts in wavelength slightly depending on the speed and direction of the expanding gas, allowing the team to probe even dust-obscured portions of the Homunculus that face away from Earth.
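The velocity mapping rests on the ordinary non-relativistic Doppler relation v = c·Δλ/λ₀ applied to that 2.12-micron H₂ line. A small sketch, using roughly 600 km/s (about the quoted 2.1 million km/h expansion speed) as an example velocity:

```python
C_KM_S = 299_792.458   # speed of light, km/s
LAMBDA0_UM = 2.12      # rest wavelength of the molecular-hydrogen line, microns

def radial_velocity_km_s(observed_um):
    """Line-of-sight velocity from v = c * (lambda - lambda0) / lambda0."""
    return C_KM_S * (observed_um - LAMBDA0_UM) / LAMBDA0_UM

# Gas receding at ~600 km/s shifts the line redward by about 0.004 microns:
shift = 600.0 / C_KM_S * LAMBDA0_UM
print(f"redshifted to {LAMBDA0_UM + shift:.4f} um")
print(f"recovered velocity: {radial_velocity_km_s(LAMBDA0_UM + shift):.0f} km/s")
```

Measuring this shift along each of the 92 swaths is what turns the spectral map into a three-dimensional velocity model.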

Scooped by Dr. Stefan Gruenwald!

FingerReader: MIT finger device reads to the blind in real time


Scientists at the Massachusetts Institute of Technology are developing an audio reading device to be worn on the index finger of people whose vision is impaired, giving them affordable and immediate access to printed words.

The so-called FingerReader, a prototype produced by a 3-D printer, fits like a ring on the user’s finger, equipped with a small camera that scans text. A synthesized voice reads words aloud, quickly translating books, restaurant menus and other needed materials for daily living, especially away from home or office.

Reading is as easy as pointing the finger at text. Special software tracks the finger movement, identifies words and processes the information. The device has vibration motors that alert readers when they stray from the script, said Roy Shilkrot, who is developing the device at the MIT Media Lab.
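The drift alert can be pictured as a simple threshold on how far the fingertip has wandered from the tracked text line. The sketch below is a hypothetical illustration of that idea, not the Media Lab's actual tracking code; the pixel tolerance is an invented parameter:

```python
def drift_alert(baseline_y, fingertip_y, tolerance_px=15):
    """Return True when the fingertip strays too far from the tracked text line.

    baseline_y: vertical position of the current text line in the camera frame (px)
    fingertip_y: vertical position of the fingertip (px)
    tolerance_px: hypothetical threshold; a real device would tune this empirically
    """
    return abs(fingertip_y - baseline_y) > tolerance_px

# Simulated trace of a finger following, then drifting off, a line at y = 100:
trace = [102, 98, 105, 100, 130]
alerts = [drift_alert(100, y) for y in trace]
print(alerts)  # the vibration motor would fire only on the last sample
```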

For Jerry Berrier, 62, who was born blind, the promise of the FingerReader is its portability and offer of real-time functionality at school, a doctor’s office and restaurants.

“When I go to the doctor’s office, there may be forms that I wanna read before I sign them,” Berrier said.

He said there are other optical character recognition devices on the market for those with vision impairments, but none that he knows of that will read in real time.

Scooped by Dr. Stefan Gruenwald!

Nanoscribe claims world’s fastest commercially available nano-3D printer title

Nanoscribe GmbH, a spin-off of Karlsruhe Institute of Technology (KIT), has built the world’s fastest 3D printer of micro- and nanostructures.

At Photonics West, the leading international fair for photonics taking place in San Francisco this week, Nanoscribe GmbH, a spin-off of Karlsruhe Institute of Technology (KIT), presents the world’s fastest 3D printer of micro- and nanostructures. With this printer, the smallest three-dimensional objects, often smaller than the diameter of a human hair, can be manufactured quickly and at maximum resolution. The printer is based on a novel laser lithography method.


“The success of Nanoscribe is an example of KIT’s excellent entrepreneurial culture and confirms our strategy of specifically supporting spin-offs. In this way, research results are transferred rapidly and sustainably to the market,” says Dr. Peter Fritz, KIT Vice President for Research and Innovation. In early 2008, Nanoscribe was founded as the first spin-off of KIT and has since established itself as the world’s market and technology leader in the area of 3D laser lithography.


Last year, 18 spin-offs were established at KIT. The 3D laser lithography systems developed by Nanoscribe (the spin-off is still located on KIT’s Campus North) are used for research by KIT and by scientists worldwide. Work in the area of photonics concentrates on replacing conventional electronics with optical circuits of higher performance. For this purpose, Nanoscribe systems are used to print polymer waveguides reaching data transfer rates of more than 5 terabits per second.


In the biosciences, the systems produce tailored scaffolds for cell-growth studies, among other applications. In materials research, functional materials of enhanced performance are developed for lightweight construction to reduce the consumption of resources. Customers include universities and research institutions as well as industrial companies.


Increased Speed: Hours Turn into Minutes

By means of the new laser lithography method, printing speed is increased by a factor of about 100. This increase in speed results from the use of a galvo mirror system, a technology that is also applied in laser show devices and in the scanning units of CD and DVD drives. Reflecting a laser beam off the rotating galvo mirrors facilitates rapid and precise laser focus positioning. “We are revolutionizing 3D printing on the micrometer scale. Precision and speed are achieved by the industrially established galvo technology. Our product benefits from more than one decade of experience in photonics, the key technology of the 21st century,” says Martin Hermatschweiler, the managing director of Nanoscribe GmbH.

Scooped by Dr. Stefan Gruenwald!

Newly declared species may have been the largest flying bird that ever lived

After decades with the title, an extinct bird loses its claim to the widest wingspan in history.

When South Carolina construction workers came across the giant, winged fossil at the Charleston airport in 1983, they had to use a backhoe to pull the bird, which lived about 25 million years ago, up from the earth.

But if the bird was actually a brand-new species, researchers faced a big question: Could such a large bird, with a wingspan of 20 to 24 feet, actually get off the ground? After all, the larger the bird, the less likely its wings are able to lift it unaided.

The answer came from Dan Ksepka, paleontologist and science curator at the Bruce Museum in Greenwich, Conn.

He modeled a probable method of flight for the long-extinct bird, named as a new species this week in the Proceedings of the National Academy of Sciences. If Ksepka’s simulations are correct, Pelagornis sandersi would be the largest airborne bird ever discovered.

Pelagornis sandersi relied on the ocean to keep it aloft. Similar in many ways to a modern-day albatross — although with at least twice the wingspan and very different in appearance, Ksepka said — the bird probably needed a lot of help to fly. It had to run downhill into a head wind, catching the air like a hang glider. Once airborne, it relied on air currents rising from the ocean to keep it gliding.

Scooped by Dr. Stefan Gruenwald!

New Technique Provides a Clear and Rapid Means of Classifying Supernova Remnants


By observing specific X-ray emissions from iron atoms in the core of supernova remnants, astronomers developed a new technique that provides a clear and rapid means of classifying supernova remnants.

An international team of astronomers using data from the Japan-led Suzaku X-ray observatory has developed a powerful technique for analyzing supernova remnants, the expanding clouds of debris left behind when stars explode. The method provides scientists with a way to quickly identify the type of explosion and offers insights into the environment surrounding the star before its destruction.

“Supernovae imprint their remnants with X-ray evidence that reveals the nature of the explosion and its surroundings,” said lead researcher Hiroya Yamaguchi, an astrophysicist at NASA’s Goddard Space Flight Center in Greenbelt, Maryland. “Thanks to Suzaku, we are now learning how to interpret these signals.”

The technique involves observing specific X-ray emissions from iron atoms in the core of supernova remnants. Even after thousands of years, these atoms remain extremely hot, stripped of most of the 26 electrons that accompany iron atoms under normal conditions on Earth. The metal is formed in the centers of shattered stars toward the end of their energy-producing lives and in their explosive demise, which makes it a key witness to stellar death.
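In outline, the method amounts to reading off the centroid energy of that iron emission: less-ionized iron (lower centroid) is the signature of a Type Ia explosion into low-density surroundings, while more-ionized iron (higher centroid) points to a core-collapse event. A toy classifier along these lines; the ~6.55 keV dividing line and the sample centroids are approximate illustrations, not the study's exact values:

```python
def classify_remnant(fe_k_centroid_kev, threshold_kev=6.55):
    """Rough classification of a supernova remnant by its Fe K-alpha centroid.

    Lower centroid -> less-ionized iron -> Type Ia candidate;
    higher centroid -> more-ionized iron -> core-collapse candidate.
    The threshold is an approximate illustration of the published dividing line.
    """
    return "Type Ia" if fe_k_centroid_kev < threshold_kev else "core-collapse"

print(classify_remnant(6.44))  # a low centroid -> "Type Ia"
print(classify_remnant(6.62))  # a high centroid -> "core-collapse"
```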

“Because Suzaku has a better sensitivity to iron emission lines than any other X-ray mission, it’s the ideal tool for investigating supernova remnants at these energies,” said Robert Petre, chief of Goddard’s X-ray Astrophysics Laboratory and a member of the study team. Suzaku was launched into Earth orbit in 2005, the fifth in a series of Japanese X-ray astronomy satellites. It was developed and is operated cooperatively by the United States and Japan.

Astronomers estimate that a supernova occurs once or twice a century in our home galaxy, the Milky Way. Each time, a blast wave and a shell of hot stellar debris expands rapidly away from the detonation, creating a supernova remnant that can be detected for tens of thousands of years. The expanding cloud slows over time as it mixes with interstellar gas and eventually becomes indistinguishable from it.

Scooped by Dr. Stefan Gruenwald!

Parasitic wasp turns roaches into zombie slaves using a neurotoxic cocktail


For decades, scientists have tried to understand the complex and gruesome relationship between the parasitic emerald wasp Ampulex compressa and its much larger victim, the common household cockroach Periplaneta americana.

At first glance, this parasite-prey relationship seems much like any other: the female wasp stings the cockroach, lays an egg on its abdomen, and once hatched, the hungry larva feeds on the cockroach. However, while most parasitic insects tend to paralyse their victims with a venomous sting, the emerald wasp instead manipulates the cockroach’s behaviour, essentially transforming it into a zombie slave.

With two stings the cockroach is left with the ability to walk, but is entirely robbed of the power to initiate its own movement. The wasp, now tired after administering two stings, regains its energy by cutting off the ends of the cockroach’s antennae and drinking its blood. Revitalised, it then latches on to the stung cockroach’s antennae and, much like an obedient toddler being led to his first day of school, the submissive insect follows the wasp’s orders.

The first sting, administered to a mass of nerve tissue in the cockroach’s thorax, contains large quantities of gamma-aminobutyric acid (GABA) and the complementary chemicals taurine and beta-alanine. GABA is a neurotransmitter that blocks the transmission of motor signals between nerves, and, together with the other two chemicals, it temporarily paralyses the cockroach’s front legs. This prevents the cockroach from escaping while the wasp inflicts the second, more toxic sting directly into the roach’s brain.

It is the second sting that turns the cockroach into a zombie, and contains what Frederic Libersat and his colleagues at Ben Gurion University refer to as a “neurotoxic cocktail”. The venom of the second sting blocks the receptors for another neurotransmitter called octopamine, which is involved in the initiation of spontaneous and complex movements such as walking.

Libersat has shown that unstung cockroaches injected with an octopamine-like compound show an increase in walking behaviour. Those injected with a chemical that blocks octopamine, however, show a reduction in spontaneous walking, much like the victims of the wasp sting. Zombie cockroaches were also able to recover from their stupor and walk after they were injected with a chemical that reactivates octopamine receptors.

Scooped by Dr. Stefan Gruenwald!

Size of the human genome reduced to 19,000 genes


A study led by Alfonso Valencia, Vice-Director of Basic Research at the Spanish National Cancer Research Centre (CNIO) and head of the Structural Computational Biology Group, and Michael Tress, researcher at the Group, updates the number of human genes -those that can generate proteins- to 19,000; 1,700 fewer than the genes in the most recent annotation, and well below the initial estimations of 100,000 genes. The work, published in the journal Human Molecular Genetics, concludes that almost all of these genes have ancestors prior to the appearance of primates 50 million years ago.

"The shrinking human genome," that's how Valencia describes the continuous corrections to the numbers of the protein-coding genes in the human genome over the years that has culminated in the approximately 19,000 human genes described in the present work. "The coding part of the genome [which produces proteins] is constantly moving," he adds: "No one could have imagined a few years ago that such a small number of genes could make something so complex."

The scientists began by analysing proteomics experiments; proteomics is the most powerful tool to detect protein molecules. In order to determine a map of human proteins, the researchers integrated data from seven large-scale mass spectrometry studies covering more than 50 human tissues, "in order to verify which genes really do produce proteins," says Valencia.

The results brought to light just over 12,000 proteins, and the researchers mapped these proteins to the corresponding regions of the genome. They analysed thousands of genes that were annotated in the human genome but that did not appear in the proteomics analysis and concluded: "1,700 of the genes that are supposed to produce proteins almost certainly do not, for various reasons: either because they do not exhibit any protein-coding features, or because the conservation of their reading frames does not support protein-coding ability," says Tress.
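Conceptually, this step is a set difference between the annotated protein-coding genes and the genes backed by peptide evidence. A toy sketch with invented gene IDs (the real analysis involved thousands of genes and additional coding-feature checks):

```python
# Toy illustration: flag annotated genes that lack peptide evidence in the MS data.
annotated = {"GENE_A", "GENE_B", "GENE_C", "GENE_D", "GENE_E"}   # made-up IDs
peptide_evidence = {"GENE_A", "GENE_C", "GENE_E"}                # detected proteins

no_evidence = annotated - peptide_evidence
print(f"{len(no_evidence)} of {len(annotated)} annotated genes lack peptide support:")
print(sorted(no_evidence))
```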

One hypothesis derived from the study is that more than 90% of human genes produce proteins that originated in metazoans or multicellular organisms of the animal kingdom hundreds of millions of years ago; the figure is over 99% for those genes whose origin predates the emergence of primates 50 million years ago.

"Our figures indicate that the differences between humans and primates at the level of genes and proteins are very small," say the researchers. David Juan, author and researcher in the Valencia lab, says that "the number of new genes that separate humans from mice [those genes that have evolved since the split from primates] may even be fewer than ten." This contrasts with the more than 500 human genes with origins since primates that can be found in the current annotation. The researchers conclude: "The physiological and developmental differences between primates are likely to be caused by gene regulation rather than by differences in the basic functions of the proteins in question."

The sources of human complexity lie more in how genes are used than in the number of genes: in the thousands of chemical changes that occur in proteins, and in the control of the production of these proteins by non-coding regions of the genome, which comprise 90% of the entire genome and which have been described in the latest findings of the international ENCODE project, a project in which the Valencia team participates.

The work brings the number of human genes closer to that of other species such as the nematode worm Caenorhabditis elegans, which is just 1 mm long but apparently less complex than humans. But Valencia prefers not to make comparisons: "The human genome is the best annotated, but we still believe that 1,700 genes may have to be re-annotated. Our work suggests that we will have to redo the calculations for all genomes, not only the human genome."

The research results are part of GENCODE, a consortium which is integrated into the ENCODE Project and formed by research groups from around the world, including the Valencia team, whose task is to provide an annotation of all the gene-based elements in the human genome.

"Our data are being discussed by GENCODE for incorporation into the new annotations. When this happens it will redefine the entire mapping of the human genome, and how it is used in macro projects such as those for cancer genome analysis," says Valencia.

Laura E. Mirian, PhD's curator insight, July 8, 2014 10:42 AM

"Our figures indicate that the differences between humans and primates at the level of genes and proteins are very small," say the researchers. 

Scooped by Dr. Stefan Gruenwald!

Small, but plentiful: how the faintest galaxies illuminated the early universe


Light from tiny galaxies over 13 billion years ago played a larger role than previously thought in creating the conditions in the universe as we know it today, a new study has found. Ultraviolet (UV) light from stars in these faint dwarf galaxies helped strip interstellar hydrogen of electrons in a process called re-ionization.

The epoch of re-ionization began about 200 million years after the Big Bang, and astrophysicists agree that it took about 800 million more years for the entire universe to become re-ionized. It marked the last major phase transition of gas in the universe, and it remains ionized today.

Astrophysicists aren’t in agreement when it comes to determining which type of galaxies played major roles in this epoch. Most have focused on large galaxies. However, a new theory by researchers at the Georgia Institute of Technology and the San Diego Supercomputer Center indicates scientists should also focus on the smallest.  The findings are reported in a paper published today in the journal Monthly Notices of the Royal Astronomical Society.

The researchers used computer simulations to demonstrate the faintest and smallest galaxies in the early universe were essential. These tiny galaxies – despite being 1000 times smaller in mass and 30 times smaller in size than the Milky Way – contributed nearly 30 percent of the UV light during this process.

Re-ionization experts often ignored these dwarf galaxies because they didn’t think they formed stars. It was assumed that UV light from nearby galaxies was too strong and suppressed star formation in these tiny neighbors.

“It turns out they did form stars, usually in one burst, around 500 million years after the Big Bang,” said John Wise, a Georgia Tech assistant professor in the School of Physics who led the study. “The galaxies were small, but so plentiful that they contributed a significant fraction of UV light in the re-ionization process.”

The team’s simulations modeled the flow of UV stellar light through the gas within galaxies as they formed. They found that the fraction of ionizing photons escaping into intergalactic space was 50 percent in small (more than 10 million solar masses) halos. It was only 5 percent in larger halos (300 million solar masses).  This elevated fraction, combined with their high abundance, is exactly the reason why the faintest galaxies play an integral role during re-ionization.

“It’s very hard for UV light to escape galaxies because of the dense gas that fills them,” said Wise. “In small galaxies, there’s less gas between stars, making it easier for UV light to escape because it isn’t absorbed as quickly. Plus, supernova explosions can open up channels more easily in these tiny galaxies in which UV light can escape.”

The team’s simulation results provide a gradual timeline that tracks the progress of re-ionization over hundreds of millions of years. About 300 million years after the Big Bang, the universe was 20 percent ionized. It was 50 percent at 550 million years. The universe was fully ionized at 860 million years after its creation.
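Those three milestones define a rough ionization history that can be interpolated. The sketch below is a simple linear interpolation between the quoted points, not the simulation's actual curve:

```python
# Milestones quoted above (Myr after the Big Bang -> ionized fraction):
MILESTONES = [(300.0, 0.20), (550.0, 0.50), (860.0, 1.00)]

def ionized_fraction(t_myr):
    """Piecewise-linear interpolation between the published milestones -- a rough
    sketch of the timeline, not the simulation's actual ionization history."""
    if t_myr <= MILESTONES[0][0]:
        return MILESTONES[0][1]
    for (t0, x0), (t1, x1) in zip(MILESTONES, MILESTONES[1:]):
        if t_myr <= t1:
            return x0 + (x1 - x0) * (t_myr - t0) / (t1 - t0)
    return 1.0  # fully ionized after 860 Myr

print(f"estimated ionized fraction at 425 Myr: {ionized_fraction(425.0):.2f}")  # 0.35
```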

Scooped by Dr. Stefan Gruenwald!

Newly spotted frozen world orbits in a binary star system


A newly discovered planet in a binary star system located 3,000 light-years from Earth is expanding astronomers’ notions of where Earth-like—and even potentially habitable—planets can form, and how to find them.

At twice the mass of Earth, the planet orbits one of the stars in the binary system at almost exactly the same distance from which Earth orbits the sun. However, because the planet’s host star is much dimmer than the sun, the planet is much colder than the Earth—a little colder, in fact, than Jupiter’s icy moon Europa.
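The temperature contrast follows from the blackbody equilibrium relation T ∝ (L/d²)^(1/4): at a fixed Earth-like distance, a much dimmer star yields a far colder planet. A sketch; the ~1.5% solar luminosity is an illustrative stand-in for the dim host star, not a measured value:

```python
def equilibrium_temp_k(luminosity_lsun, distance_au, albedo=0.0):
    """Blackbody equilibrium temperature: ~278 K for a zero-albedo body at 1 AU
    from the Sun, scaling as L^(1/4) and 1/sqrt(d)."""
    return 278.3 * (luminosity_lsun * (1.0 - albedo)) ** 0.25 / distance_au ** 0.5

print(f"Earth-like orbit, Sun-like star: {equilibrium_temp_k(1.0, 1.0):.0f} K")
# A star of ~1.5% the Sun's luminosity (illustrative) at the same distance gives
# a temperature below that of Europa (roughly 100 K):
print(f"Earth-like orbit, dim star:      {equilibrium_temp_k(0.015, 1.0):.0f} K")
```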

Four international research teams, led by professor Andrew Gould of The Ohio State University, published their discovery in the July 4 issue of the journal Science.

The study provides the first evidence that terrestrial planets can form in orbits similar to Earth’s, even in a binary star system where the stars are not very far apart. Although this planet itself is too cold to be habitable, the same planet orbiting a sun-like star in such a binary system would be in the so-called “habitable zone”—the region where conditions might be right for life.

“This greatly expands the potential locations to discover habitable planets in the future,” said Scott Gaudi, professor of astronomy at Ohio State. “Half the stars in the galaxy are in binary systems. We had no idea if Earth-like planets in Earth-like orbits could even form in these systems.”


A video map of motions in the nearby universe


An international team of researchers, including University of Hawaii at Manoa astronomer Brent Tully, has mapped the motions of structures of the nearby universe in greater detail than ever before. The maps are presented as a video, which provides a dynamic three-dimensional representation of the universe through the use of rotation, panning, and zooming. The video was announced last week at the conference "Cosmic Flows: Observations and Simulations" in Marseille, France, that honored the career and 70th birthday of Tully.

The Cosmic Flows project has mapped visible and dark matter densities around our Milky Way galaxy up to a distance of 300 million light-years.

The team includes Helene Courtois, associate professor at the University of Lyon, France, and associate researcher at the Institute for Astronomy (IfA), University of Hawaii (UH) at Manoa, USA; Daniel Pomarede, Institute of Research on Fundamental Laws of the Universe, CEA/Saclay, France; Brent Tully, IfA, UH Manoa; and Yehuda Hoffman, Racah Institute of Physics, University of Jerusalem, Israel.

The large-scale structure of the universe is a complex web of clusters, filaments, and voids. Large voids—relatively empty spaces—are bounded by filaments that form superclusters of galaxies, the largest structures in the universe. Our Milky Way galaxy lies in a supercluster of 100,000 galaxies.

Just as the movement of tectonic plates reveals the properties of Earth's interior, the movements of the galaxies reveal information about the main constituents of the Universe: dark energy and dark matter. Dark matter is unseen matter whose presence can be deduced only by its effect on the motions of galaxies and stars because it does not give off or reflect light. Dark energy is the mysterious force that is causing the expansion of the universe to accelerate.


Engineered red blood cells could carry precious therapeutic cargo


Whitehead Institute scientists have genetically and enzymatically modified red blood cells to carry a range of valuable payloads—from drugs, to vaccines, to imaging agents—for delivery to specific sites throughout the body.

“We wanted to create high-value red cells that do more than simply carry oxygen,” says Whitehead Founding Member Harvey Lodish, who collaborated with Whitehead Member Hidde Ploegh in this pursuit. “Here we’ve laid out the technology to make mouse and human red blood cells in culture that can express what we want and potentially be used for therapeutic or diagnostic purposes.”

The work, published this week in the Proceedings of the National Academy of Sciences (PNAS), combines Lodish’s expertise in the biology of red blood cells (RBCs) with biochemical methods developed in Ploegh’s lab.

RBCs are an attractive vehicle for potential therapeutic applications for a variety of reasons, including their abundance—they are more numerous than any other cell type in the body—and their long lifespan (up to 120 days in circulation). Perhaps most importantly, during RBC production, the progenitor cells that eventually mature to become RBCs jettison their nuclei and all DNA therein. Without a nucleus, a mature RBC lacks any genetic material or any signs of earlier genetic manipulation that could result in tumor formation or other adverse effects.

Exploiting this characteristic, Lodish and his lab introduced genes coding for specific slightly modified normal red cell surface proteins into early-stage RBC progenitors. As the RBCs approach maturity and enucleate, the proteins remain on the cell surface, where they are modified by Ploegh’s protein-labeling technique. Referred to as “sortagging,” the approach relies on the bacterial enzyme sortase A to establish a strong chemical bond between the surface protein and a substance of choice, be it a small-molecule therapeutic or an antibody capable of binding a toxin. The modifications leave the cells and their surfaces unharmed.

“Because the modified human red blood cells can circulate in the body for up to four months, one could envision a scenario in which the cells are used to introduce antibodies that neutralize a toxin,” says Ploegh. “The result would be long-lasting reserves of antitoxin antibodies.”


Segway Inventor Dean Kamen Thinks His New Stirling Engine Will Get People Off The Grid For Under $10K

"Ten years from today the probability that you are depending on wires hanging on tree branches is as likely as that you'll still be installing land lines for telephones. Close to zero."

Inventor Dean Kamen is planning a 2.5 kW home version of his Deka Research Beacon 10 Stirling engine that could provide efficient around-the-clock power or hot water to a home or business, reports Forbes. Kamen says the current Beacon is intended for businesses like laundries or restaurants that use a lot of hot water. “With commercialization partner NRG Energy, he’s deployed roughly 20 of the machines and expects to put them into production within 18 months,” says Forbes.

But Kamen has bigger plans: feeding excess power to the grid by networking devices across a region together. Depending on the price of natural gas, “ten years from today the probability that you are depending on wires hanging on tree branches is as likely as that you’ll still be installing land lines for telephones,” he says. “Close to zero.”


IBM aims for commercializing its first carbon nanotube transistors in the early 2020s


IBM has announced that it expects to have commercialised its carbon nanotube transistor technology in the early 2020s, thanks to a new design that would allow the transistors to be built on silicon wafers using similar techniques to existing chip manufacturing plants.

The semiconductor industry has been working hard for the last few decades on following Moore's Law, the observation by Intel co-founder Gordon Moore that the number of transistors on a chip tends to double roughly every eighteen months. In recent years, following that trend has become increasingly complex: the ever-shrinking size of the components and the distance between them makes manufacturing difficult, while interference between components must be corrected and designed out.
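The doubling cadence mentioned above is easy to turn into a back-of-the-envelope projection. A minimal sketch (illustrative only; the 18-month period and the starting count are assumptions, not figures from the article):

```python
def transistor_count(start_count, years, doubling_period_years=1.5):
    """Project transistor count under a fixed doubling period (Moore's-law cadence)."""
    return start_count * 2 ** (years / doubling_period_years)

# A decade of 18-month doublings multiplies the count roughly a hundredfold:
print(f"{transistor_count(1e9, 10):.3g}")  # starting from a hypothetical 1e9 transistors
```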

One possible solution is a move away from traditional semiconductor designs, and numerous companies are working on exactly that. Back in 2012, IBM announced the creation of a 9nm carbon nanotube transistor, dropping below the 10nm barrier for the first time. In September last year, the company further announced that it had used the transistors to build a fully working computer for the first time, but it remained silent as to when the technology would be likely to leave the lab and reach shop shelves.

Speaking to MIT's Technology Review, IBM researchers have finally given themselves a deadline: to have commercialised carbon nanotube transistors by the early 2020s. The secret is a shift in design: each transistor is built from six nanotubes, each 1.4nm wide, lined up in parallel. This design, the company has claimed, could potentially be manufactured in current semiconductor fabrication plants with little modification - the route to market the technology desperately needed.


How wet is Earth's soil? NASA's Aquarius Returns Global Maps of Soil Moisture


Soil moisture, the water contained within soil particles, is an important player in Earth's water cycle. It is essential for plant life and influences weather and climate. Satellite readings of soil moisture will help scientists better understand the climate system and have potential for a wide range of applications, from advancing climate models, weather forecasts, drought monitoring and flood prediction to informing water management decisions and aiding in predictions of agricultural productivity.

Launched June 10, 2011, aboard the Argentinian spacecraft Aquarius/Satélite de Aplicaciones Científicas (SAC)-D, Aquarius was built to study the salt content of ocean surface waters. The new soil wetness measurements were not in the mission's primary science objectives, but a NASA-funded team led by U.S. Department of Agriculture (USDA) researchers has developed a method to retrieve soil moisture data from the instrument's microwave radiometer.

The Aquarius measurements are considerably coarser in spatial resolution than the measurements from the upcoming NASA Soil Moisture Active Passive (SMAP) mission, which was specifically designed to provide the highest quality soil moisture measurements available, including a spatial resolution 10 times that offered by Aquarius.

Soils naturally radiate microwaves, and the Aquarius sensor can detect the microwave signal from the top 2 inches (5 centimeters) of the land, a signal that subtly varies with changes in the wetness of the soil. Aquarius takes eight days to complete each worldwide survey of soil moisture, albeit with gaps in mountainous or vegetated terrain where the microwave signal becomes difficult to interpret.


MIT: Send Your Own Message, Selfie, Video / Audio Clips To Mars On This High-Tech Time Capsule

The first private mission to the Red Planet aims to carry selfies with it.

The first human visitors to Mars may have a warm reminder of home to greet them when they land. With the help of major partners and the nonprofit Explore Mars, a team of college students is now raising money to launch the first private mission to the Red Planet. It won't carry people but messages.

For a 99-cent contribution, anyone can send their submissions of images, messages, audio clips, and videos to put on the time capsule. The team hopes to send the $25 million project to Mars in five years, far ahead of any manned mission.

The idea germinated at a TGI Fridays. Emily Briere, a Duke University engineering major, had just attended the Humans to Mars Summit in Washington, D.C., and was chatting with her family and her family friend--inventor and space enthusiast Eric Knight--about how to breathe some life and excitement into some of the advanced technologies they had just learned about.

The group decided that a time capsule project--the challenge of creating a small craft that could land and survive on Mars, and maintain data from Earth--was the perfect mission: It would educate and excite a large number of people about space travel and unite people globally around a common goal. Importantly for raising the required money, it would also be useful in testing different technologies used to get the capsule to Mars.

“Our goal was to create almost an Apollo-era level of excitement,” says Briere, who is now a rising junior at Duke and serves as mission director for the Time Capsule to Mars project.

Partnered with MIT’s Space Propulsion Lab, the group will test the newest space engine technology--ion electrospray propulsion--which could reduce travel time to Mars to as little as four months. It will also test other designs, including “delay-tolerant networking” systems that will help the capsule transmit data from deep space and optical quartz storage technology that will encode terabytes of time capsule data for millions of years. According to Briere, the biggest current technological challenge is maintaining communications with the capsule from 140 million miles away--they plan to try out inflatable antennas.

The project has already gained the support and advising of partners including Lockheed Martin, NASA, Stanford, Duke, UConn, and MIT, among other organizations, as well as two former NASA chief astronauts. Students at a growing network of universities are collaborating on the design, technical, business, and marketing plans (Briere's three siblings are also involved). The group is also focusing on an educational mission “to close the gap that currently exists between student interest and the opportunities available to advance space exploration.” Individuals will be able to take part in the mission through virtual Mission Control portals and get involved in scientific experiments and data that will be collected on board.

Mark Menegon's curator insight, May 6, 2017 9:06 PM
Students can spark their interest in critical and creative thinking with the idea of a mission to mars.  Submission of images, messages, audio clips or videos can be a project for students to produce group projects to include in a time capsule to send to Mars.  For the small financial fee of 99 cents, the school students could send their submissions on behalf of the school.  This would also be exciting for the teachers!!

In Search for Extraterrestrial Life - Archive of Articles (from SETI and


A checklist for the requirements of life as scientists define it could help ground speculation about the possibilities of alien life on distant worlds, new research suggests. Astronomers have confirmed the existence of more than 1,700 planets beyond the solar system, and may soon prove the existence of thousands more of such exoplanets.

"As we find more and more exoplanets, we are certainly going to discover worlds that resemble Earth to some degree," said study author Chris McKay, an astrobiologist at NASA Ames Research Center in Moffett Field, California. "This raises the question of whether or not such exoplanets could support life and what kind of life might live there."

To understand whether life might exist on alien worlds, McKay suggested scientists should evaluate both the requirements for life on Earth and the limits of life on Earth. Although scientific understanding of the requirements for life has not changed in many years, researchers' thoughts on the limits of life have shifted significantly in the past few decades, McKay said.

There are four general categories of the requirements for life on Earth: energy, carbon, liquid water and miscellaneous factors, McKay said.

The energy for life on Earth all comes from the shuffling of electrons from molecule to molecule by chemical reactions, which is driven by light-absorbing proteins in the case of photosynthesis. Carbon is the backbone of life on Earth because it can support an extraordinary variety of molecules for use in biology. Liquid water serves as the solvent in which the chemical reactions of life on Earth take place. Other factors required by life on Earth include elements such as nitrogen, which is used to make proteins and DNA, among many other molecules.

McKay noted that life could dominate exoplanets, and hence be detectable over interstellar distances. But that would only happen if that life is powered by light, he said. Still, life may not need much light in all cases; algae on Earth that live in the deep sea or under ice can survive on sunlight at levels less than one-100,000th of what Earth receives, a value roughly 100-fold less than the amount of light that Pluto receives.
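The light-level comparison above is a simple inverse-square calculation; here is a quick sanity check (my own sketch; Pluto's 39.5 AU mean distance is an assumed round figure, not from the article):

```python
# Sunlight falls off as 1/r^2, and Earth sits at 1 AU by definition.
PLUTO_DISTANCE_AU = 39.5                    # assumed mean orbital distance
pluto_light = 1.0 / PLUTO_DISTANCE_AU ** 2  # fraction of Earth's insolation
algae_threshold = 1.0 / 100_000             # survival threshold quoted above

ratio = pluto_light / algae_threshold
print(f"Pluto receives {pluto_light:.1e} of Earth's sunlight,")
print(f"about {ratio:.0f}x the deep-sea algae survival threshold")
```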

Light can also provide energy for life in ways beyond photosynthesis. For instance, on worlds like Saturn's moon Titan, sunlight generates molecules such as acetylene and hydrogen gas in the atmosphere that could be used for energy in alien biology.

"Microorganisms on the surface of Titan would have these food sources just coming down from the sky, no need to bother with photosynthesis," McKay explains.

To investigate the limits of life on Earth, researchers look at extremophiles, organisms that have adapted to live in environmental extremes, such as extremes of heat, cold and radiation. The highest temperature at which scientists know life can live has increased significantly, from 176 degrees F (80 degrees C) to a whopping 251 degrees F (122 degrees C), or well above boiling. Recently, investigators also discovered microbes can live in temperatures as cold as 5 degrees F (minus 15 degrees C), or well below freezing.
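The Fahrenheit/Celsius pairs quoted above check out against the standard conversion formula; a tiny verification (mine, not the article's):

```python
def f_to_c(deg_f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (deg_f - 32) * 5 / 9

# (quoted degrees F, quoted degrees C) pairs from the paragraph above
for deg_f, quoted_c in [(176, 80), (251, 122), (5, -15)]:
    print(f"{deg_f} F = {f_to_c(deg_f):.1f} C (article quotes {quoted_c} C)")
```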

McKay suggested that many potential limits to life, such as acidity, saltiness or ultraviolet radiation, are unlikely to be extreme enough to stifle life. He said the most important parameter for Earth-like life may be the presence of liquid water, but studies of life in extreme deserts show that even a small amount of rain, fog, snow and even simple humidity can help sustain life. Moreover, alien life may not even need liquid water; the liquid hydrocarbons on Titan, for example, might serve as the basis for life, playing the same role water does for life on Earth.

Infospectives's curator insight, July 8, 2014 4:21 PM

For those hoping for a better life off-world...


Reinterpretation of Cold Dark Matter in the Universe as a Bose-Einstein Condensate


Newly published research presents a reinterpretation of cold dark matter, opening up the possibility that it could be regarded as a very cold quantum fluid governing the formation of the structure of the Universe.

Tom Broadhurst, an Ikerbasque researcher at the UPV/EHU’s Department of Theoretical Physics, has participated alongside scientists of the National Taiwan University in a piece of research that explores cold dark matter in depth and proposes new answers about the formation of galaxies and the structure of the Universe. These predictions, published in the prestigious journal Nature Physics, are being contrasted with fresh data provided by the Hubble space telescope.

In cosmology, cold dark matter is a form of matter whose particles move slowly in comparison with light and interact weakly with electromagnetic radiation. It is estimated that only a minute fraction of the matter in the Universe is baryonic matter, which forms stars, planets and living organisms. The rest, comprising over 80%, is dark matter and dark energy.

The theory of cold dark matter helps to explain how the universe evolved from its initial state to the current distribution of galaxies and clusters, the structure of the Universe on a large scale. However, the theory has been unable to satisfactorily explain certain observations, and the new research by Broadhurst and his colleagues sheds new light in this respect.

As the Ikerbasque researcher explained, “guided by the initial simulations of the formation of galaxies in this context, we have reinterpreted cold dark matter as a Bose-Einstein condensate”. So, “the ultra-light bosons forming the condensate share the same quantum wave function, so disturbance patterns are formed on astronomic scales in the form of large-scale waves”.

This theory can be used to suggest that all the galaxies in this context should have at their center large stationary waves of dark matter called solitons, which would explain the puzzling cores observed in common dwarf galaxies.

The research also makes it possible to predict that galaxies are formed relatively late in this context in comparison with the interpretation of standard particles of cold dark matter. The team is comparing these new predictions with observations by the Hubble space telescope.

The results are very promising, as they open up the possibility that dark matter could be regarded as a very cold quantum fluid that governs the formation of structure across the whole Universe. The research also opens up fresh possibilities for studying the first galaxies to emerge after the Big Bang.


Scientists have made light appear to break Newton’s third law


Laser pulses have been made to accelerate themselves around loops of optical fibre, which seems to go against Newton’s third law. This states that for every action there is an equal and opposite reaction. The new research exploits a loophole that makes light appear to have mass.

Under Newton’s third law of motion, if we imagine one billiard ball striking another upon a pool table, the two balls will bounce away from each other. If one of the billiard balls had a negative mass, then the collision of the two balls would result in them accelerating in the same direction. This effect could be used in a diametric drive, where negative and positive mass interact for a continuously propulsive effect. Such a drive also relies on the assumption that negative mass has negative inertia. 
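The billiard-ball thought experiment is just Newton's second law with a sign flip; a toy sketch of the idea (an illustration of the mechanics, not of the optical experiment):

```python
def accelerations(m1, m2, force_on_1):
    """Accelerations of two bodies exchanging equal and opposite contact forces.

    force_on_1 is the force body 2 exerts on body 1; by Newton's third law
    the force on body 2 is -force_on_1.
    """
    return force_on_1 / m1, -force_on_1 / m2

# Ordinary collision: the bodies accelerate apart.
print(accelerations(1.0, 1.0, -5.0))   # (-5.0, 5.0)
# Hypothetical negative mass: both accelerate the same way - a diametric drive.
print(accelerations(1.0, -1.0, -5.0))  # (-5.0, -5.0)
```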

Quantum mechanics however states that matter cannot have a negative mass. Negative mass is not the same as antimatter, as even antimatter has positive mass. Negative mass is a hypothetical concept of matter where mass is of opposite sign to the mass of normal matter. Negative mass is used in speculative theories, such as the construction of wormholes. Should such matter exist, it would violate one or more energy conditions and show strange properties. No material object has ever been found that can be shown by experiment to have a negative mass.

Experimental physicist Ulf Peschel and his colleagues at the University of Erlangen-Nuremberg in Germany have now made a diametric drive using effective mass. Photons travelling at the speed of light have no rest mass, but shining pulses of light into layered materials like crystals means some of the photons can be reflected backwards by one layer and forwards by another. This delays part of the pulse, which interferes with the rest of the pulse as it passes more slowly through the material.

When a material such as a layered crystal slows a light pulse in proportion to its energy, the pulse behaves as if it has mass. This is called effective mass: the mass that a particle appears to have when responding to forces. Light pulses can have a negative effective mass depending on the shape of their light waves and the structure of the crystal material they pass through. But demonstrating a diametric-drive effect between pulses of positive and negative effective mass in a crystal would require a crystal so long that it would absorb the light before the effect could be observed.

Peschel therefore created a series of laser pulses in two loops of fibre-optic cable to get around these requirements. The pulses were split between the loops at a contact point, and the light kept moving around each loop in the same direction.

Infospectives's curator insight, July 8, 2014 5:45 PM

Anyone in the market for a man-made wormhole?


Tweet This: FDA Finally Proposes Social Media Guidelines


After several years of anticipation, the FDA has finally proposed a pair of guidelines for how drug and device makers should cope with some of the challenges and pitfalls posed by social media.

One of the so-called draft guidances offers instructions on how companies should attempt to correct product information on websites that are run by others, such as chat rooms. The other addresses how products – including risk and benefit information – can be discussed in venues such as Twitter, as well as paid search links on Google and Yahoo, all of which have limited space. This will involve, for instance, clickable links to product websites.

“These are intended to have a beneficial impact on public health,” Tom Abrams, who heads the FDA Office of Prescription Drug Promotion, tells us. “But these were not developed in a vacuum. They were developed with careful consideration and with input from industry and many other stakeholders. There was a lot of important consideration given to the issues.”

For third-party websites, such as Wikipedia, the draft guidance suggests that companies should feel free to correct misinformation, but that any correction must include balanced information and the source of the revision or update must be noted, Abrams explains. This means a company or company employee or contractor should be credited with any additions.

“The information should not be promotional and should be factually correct. This is not an opportunity for a company to tout its drugs,” he says. “The information [being added or revised] should be consistent with the FDA-approved [product] labeling and for it to be effective, you want it posted right by the misinformation.”

The guidance also says that companies should contact writers, such as bloggers, to make changes when they learn of misinformation. Abrams notes companies will not be held responsible for those who do not make changes. If none of this is possible, he says companies should contact web site operators and suggest they delete the misinformation or open the site to comments so that corrections can be made.

The guidelines are being released nearly five years after the FDA held a well-attended public hearing to sift through Internet issues confronting drug and device makers. Despite repeated signals that the guidelines might be forthcoming, they never materialized until now: FDA officials must act before a July deadline set by a 2012 law requiring them to release guidance on product promotion on the Internet.


New imaging method allows to see how the small intestine operates in real time


“Nanojuice” could improve how doctors examine the small intestine.

Located deep in the human gut, the small intestine is not easy to examine. X-rays, MRIs and ultrasound images provide snapshots but each suffers from its own limitations.

University at Buffalo researchers are developing a new imaging technique involving nanoparticles suspended in liquid to form a “nanojuice” that patients would drink. Once the nanoparticles reach the small intestine, doctors would strike them with a harmless laser light, providing an unparalleled, non-invasive, real-time view of the organ.

Described July 6 in the journal Nature Nanotechnology, the advancement could help doctors better identify, understand and treat gastrointestinal ailments.

“Conventional imaging methods show the organ and blockages, but this method allows you to see how the small intestine operates in real time,” said corresponding author Jonathan Lovell, PhD, UB assistant professor of biomedical engineering. “Better imaging will improve our understanding of these diseases and allow doctors to more effectively care for people suffering from them.”

In laboratory experiments performed with mice, the researchers administered the nanojuice orally. They then used photoacoustic tomography (PAT), in which pulsed laser light generates pressure waves that, when measured, provide a real-time and more nuanced view of the small intestine.

The researchers plan to continue to refine the technique for human trials, and move into other areas of the gastrointestinal tract.


DR5 Protein Helps Cells To Adapt—or Die

Scientists show how cell stress both prevents and promotes cell suicide in a study that’s equally divisive.

A cellular stress pathway called the unfolded protein response (UPR) both activates and degrades death receptor 5 protein (DR5), which can promote or prevent cell suicide, according to a paper published in Science today (July 3). The theory is that initial stress blocks cell suicide, or apoptosis, to give the cell a chance to adapt, but that if the stress persists, it eventually triggers apoptosis.

“This work has made the most beautiful simplification of all this big complex mess. Basically, they identified and pinpointed the specific protein involved in the switching decision and explain how the decision is made,” said Alexei Korennykh, a professor of molecular biology at Princeton University, who was not involved in the work.

But Randal Kaufman of the Sanford-Burnham Medical Research Institute in La Jolla, California, was not impressed. He questioned the physiological relevance of the experiments supporting the authors’ main conclusions about this key cellular process.

Protein folding in a cell takes place largely in the endoplasmic reticulum (ER), but if the process goes awry, unfolded proteins accumulate, stressing the ER. This triggers the UPR, which shuts down translation, degrades unfolded proteins, and increases production of protein-folding machinery. If ER stress is not resolved, however, the UPR can also induce apoptosis.

Two main factors control the UPR—IRE1a and PERK. IRE1a promotes cell survival by activating the transcription factor XBP1, which drives expression of cell-survival genes. PERK, on the other hand, activates a transcription factor called CHOP, which in turn drives expression of the proapoptotic factor DR5.

Peter Walter of the University of California, San Francisco, and his colleagues have now confirmed that CHOP activates DR5, showing that it is a cell-autonomous process. But they have also found that IRE1a suppresses DR5, directly degrading its mRNA through a process called regulated IRE1a-dependent degradation (RIDD). Inhibition of IRE1a in a human cancer cell line undergoing ER stress both prevented DR5 mRNA decay and increased apoptosis.
