Amazing Science
Scooped by Dr. Stefan Gruenwald!

Circulating Avian Influenza Viruses Closely Related to the 1918 Virus Have Pandemic Potential


Animal influenzas, including bird or avian flu, are thought to be the reservoir for deadly human strains like the one that caused the 1918 pandemic. Scientists have noted recently that the genes of currently circulating avian flu viruses are very similar to the 1918 strain. Professor Yoshihiro Kawaoka of the University of Wisconsin-Madison followed this hunch by generating a strain called “1918-like” by combining genes from eight currently circulating avian flu virus strains.

Alarmingly, he found in testing that it has high potential to infect humans and cause transmission between them. Further, a mere seven genetic changes are sufficient to generate a strain that is transmissible through the air.

To pinpoint the genes that contribute most to the enhanced infectivity of the “1918-like” strain relative to normal avian strains, the researchers went gene by gene. They systematically created strains that carried only one gene from the 1918 strain against the genetic background of an otherwise typical avian influenza strain, and were able to show that hemagglutinin (HA) and an RNA polymerase subunit (PB2) are the strongest contributors to the pathogenicity of the laboratory-generated “1918-like” strain. HA is what the flu virus uses to latch onto the exterior of a human cell; PB2 is what it uses to manufacture copies of itself. Both were found to be more efficient in the 1918 and “1918-like” strains.

At first the researchers found that the “1918-like” strain was not transmissible. But with these two important genes identified, they found that swapping in the HA and PB2 genes from the original 1918 strain gave rise to transmission. This led them to generate slight variations of “1918-like” to see how easily a transmissible strain could arise.

The researchers again focused on the HA and PB2 genes, introducing a small number of mutations. Remarkably, one of the resulting strains, a “1918-like” virus with a mere seven genetic changes across three genes, could infect their test subjects and be transmitted between them.

One bright spot in the research is that blood from people who were vaccinated against the 2009 seasonal flu strain also reacts to the dangerous 1918-like strains generated in Kawaoka’s laboratory, giving rise to hope that the population may already have some protection. Influenza research of this type requires a high level of safety procedures and precautions. The work carried out by Kawaoka required what is called Biosafety Level 3. This entails the use of negative-pressure hoods, proper safety attire, and restricted access during experimentation. The highest level, Biosafety Level 4, is reserved for Ebola and other fast-acting, deadly public-health disease agents.

Amy Zhu's curator insight, April 23, 2017 4:39 AM
This article accentuates how risk management addresses the OHS issues of safety and health when working in a research facility to prevent the spread of avian influenza virus, which can be transmitted from ill animals to humans. This environment is hazardous because it can involve dangerous substances and chemicals associated with the avian strains. Exposure to the virus can greatly impact a worker's safety and health through contracting symptoms that are likely to be transmitted to another worker in the workplace, putting his or her well-being at risk as well if not treated. These risks can be managed through correct hygiene and safety practices conducted daily, treated with the utmost importance so as not to trigger an accidental virus outbreak. Clean, full-coverage attire should be worn during experimentation with hair tied back, access limited, and hands sanitised, to avoid transmission of the virus through poor hygiene practices. Therefore, the OHS issues of safety and health have been addressed by managing these risks through sound methods.

CHD8: Genetic basis for a distinct type of autism uncovered


A variation in the CHD8 gene has a strong likelihood of leading to a type of autism accompanied by digestive problems, a larger head and wide-set eyes. 

“We finally got a clear-cut case of an autism-specific gene,” said Raphael Bernier, University of Washington associate professor of psychiatry and behavioral sciences and clinical director of the Autism Center at Seattle Children’s. He is one of the lead authors of a Cell paper published today, “Disruptive CHD8 Mutations Define a Subtype of Autism in Early Development.” 

Scientists at 13 institutions around the world collaborated on the project.

Autism may have many genetic and other causes, and can vary in how it affects individuals. Currently autism is diagnosed based on behavioral traits.

Today’s discovery is part of an emerging approach to studying the underlying mechanisms of autism and what those mean for people with the condition. Many research teams are trying to group subtypes of autism based on genetic profiles.

The approach could uncover hundreds more genetic mutations. Genetic testing for the various forms eventually could be offered to families to guide them on what to expect and how to care for their child.  

In their study of 6,176 children with autism spectrum disorder, researchers found 15 had a CHD8 mutation. All the cases had similarities in their physical appearance as well as sleep disturbances and gastrointestinal problems. Bernier and his team interviewed all 15 of the children.

To confirm the findings, the researchers worked with scientists at Duke University who study genetically modified zebrafish, a common laboratory model for gene mutation studies. After the researchers disrupted the fish’s CHD8 gene, the small fry developed large heads and wide-set eyes. They then fed the naturally semi-transparent fish fluorescent pellets to observe their digestion. They found that the fish had problems discarding food waste and were constipated.

Bernier said this is the first time researchers have shown a definitive cause of autism from a genetic mutation. Previously identified genetic conditions like Fragile X, which accounts for a greater number of autism cases, are associated with other neurological impairments, such as intellectual disability, more than with autism. Although less than half a percent of all people with autism will have this subtype, Bernier said this study has many implications.


New to Google Earth: Ancient Flying Reptiles Database and Mapping Tool


A newly developed website catalogs more than 1,300 specimens of extinct flying reptiles called pterosaurs, enabling users to map out the ancient creatures on Google Earth. The goal is to help researchers find trends in the evolution and diversity of these ancient winged reptiles.

"Having a very specific database like this, which is just for looking at individual fossil specimens of pterosaurs, is very helpful, because you can ask questions that you couldn't have answered with bigger databases [of more animals]," said Matthew McLain, a doctoral candidate in paleontology at Loma Linda University in California and one of the three developers of the site. McLain and his colleagues call their database PteroTerra.

Pterosaurs were the first flying vertebrates. They lived between 228 million and 66 million years ago, and went extinct around the end of the Cretaceous period. During that time, this group evolved to be incredibly diverse. Some were tiny, like the sparrow-size Nemicolopterus crypticus, which lived 120 million years ago in what is now China. Others were simply huge, like Quetzalcoatlus, which was as tall as a giraffe and probably went around spearing little dinosaurs with its beak like a stork might snack on frogs.

Paleontological databases are common tools, because they allow researchers to navigate through descriptions of fossil specimens. One of the largest, the Paleobiology Database, has more than 50,000 individual entries.

McLain and his colleagues wanted something more targeted. They painstakingly built PteroTerra from the ground up. McLain, as the paleontologist on the project, read published papers on pterosaurs and visited museums to catalog specimens.

"I think we have every species represented, so in that sense, it's pretty complete," he told Live Science. The database does not contain every specimen of pterosaur material ever found — tens of thousands of fossil fragments have been discovered — but McLain hopes to get other paleontologists on board as administrators to upload their specimen data.
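The core idea of pairing a specimen database with Google Earth can be sketched in a few lines: each record needs little more than a species name and coordinates, which are rendered as KML placemarks that Google Earth can open directly. The records and field names below are hypothetical illustrations, not PteroTerra's actual schema or API.

```python
from xml.sax.saxutils import escape

# Hypothetical specimen records -- NOT PteroTerra's actual data or schema,
# just an illustration of the database-to-Google-Earth idea.
specimens = [
    {"species": "Nemicolopterus crypticus", "lat": 41.8, "lon": 120.8},
    {"species": "Quetzalcoatlus northropi", "lat": 29.3, "lon": -103.5},
]

def to_kml(records):
    """Render specimen records as a minimal KML document."""
    placemarks = "".join(
        "<Placemark><name>{}</name>"
        "<Point><coordinates>{},{}</coordinates></Point></Placemark>".format(
            escape(r["species"]), r["lon"], r["lat"])  # KML order: lon,lat
        for r in records
    )
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<kml xmlns="http://www.opengis.net/kml/2.2">'
            '<Document>' + placemarks + '</Document></kml>')

kml = to_kml(specimens)
```

Saving the string to a `.kml` file and opening it in Google Earth would drop one pin per specimen; a real tool like PteroTerra would add fields for formation, age, and taxonomy on top of this.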


Ninety-nine percent of the ocean's plastic is missing


Millions of tons. That’s how much plastic should be floating in the world’s oceans, given our ubiquitous use of the stuff. But a new study finds that 99% of this plastic is missing. One disturbing possibility: Fish are eating it.

If that’s the case, “there is potential for this plastic to enter the global ocean food web,” says Carlos Duarte, an oceanographer at the University of Western Australia, Crawley. “And we are part of this food web.”

Humans produce almost 300 million tons of plastic each year. Most of this ends up in landfills or waste pits, but a 1970s National Academy of Sciences study estimated that 0.1% of all plastic washes into the oceans from land, carried by rivers, floods, or storms, or dumped by maritime vessels. Some of this material becomes trapped in Arctic ice and some, landing on beaches, can even turn into rocks made of plastic. But the vast majority should still be floating out there in the sea, trapped in midocean gyres—large eddies in the center of oceans, like the Great Pacific Garbage Patch.

To figure out how much refuse is floating in those garbage patches, four ships of the Malaspina expedition, a global research project studying the oceans, fished for plastic across all five major ocean gyres in 2010 and 2011. After months of trailing fine mesh nets around the world, the vessels came up light—by a lot. Instead of the millions of tons scientists had expected, the researchers calculated the global load of ocean plastic to be only about 40,000 tons at most, they report online today in the Proceedings of the National Academy of Sciences. “We can’t account for 99% of the plastic that we have in the ocean,” says Duarte, the team’s leader.
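The size of the gap can be sanity-checked with back-of-the-envelope arithmetic. The 300-million-ton annual production figure and the 0.1% runoff estimate are from the article; the 40 years of accumulation is an illustrative assumption, not a figure from the study.

```python
# Back-of-the-envelope estimate of the "missing" ocean plastic.
# From the article: ~300 million tons produced per year, and a 1970s NAS
# estimate that ~0.1% of all plastic washes into the oceans.
annual_production_tons = 300e6
fraction_to_ocean = 0.001                  # 0.1%
yearly_input_tons = annual_production_tons * fraction_to_ocean  # 300,000 t/yr

# Illustrative assumption: roughly four decades of accumulation afloat.
years = 40
expected_afloat_tons = yearly_input_tons * years   # 12 million tons

measured_tons = 40_000                     # Malaspina upper estimate
fraction_found = measured_tons / expected_afloat_tons  # ~0.3%
```

Even this crude tally leaves well over 99% of the expected plastic unaccounted for, which is the discrepancy Duarte's team is trying to explain.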

He suspects that a lot of the missing plastic has been eaten by marine animals. When plastic is floating out on the open ocean, waves and radiation from the sun can fragment it into smaller and smaller particles, until it gets so small that it begins to look like fish food—especially to lanternfish, a widespread small marine fish known to ingest plastic.

“Yes, animals are eating it,” says oceanographer Peter Davison of the Farallon Institute for Advanced Ecosystem Research in Petaluma, California, who was not involved in the study. “That much is indisputable.”

But, he says, it’s hard to know at this time what the biological consequences are. Toxic ocean pollutants like DDT, PCBs, or mercury cling to the surface of plastics, causing them to “suck up all the pollutants in the water and concentrate them.” When animals eat the plastic, that poison could be going into the fish and traveling up the food chain to market species like tuna or swordfish. Or, Davison says, toxins in the fish “may dissolve back into the water … or for all we know they’re puking [the plastic] or pooping it out, and there’s no long-term damage. We just don’t know.”

Eric Chan Wei Chiang's comment, July 8, 2014 3:55 AM
Much of the missing plastic is converted into microplastics, and some of it is consumed by wildlife.

Overcoming light scattering: Single-pixel optical system uses compressive sensing to see deeper inside tissue


Optical imaging methods are rapidly becoming essential tools in biomedical science because they're noninvasive, fast, cost-efficient and pose no health risks since they don't use ionizing radiation. These methods could become even more valuable if researchers could find a way for optical light to penetrate all the way through the body's tissues. With today's technology, even passing through a fraction of an inch of skin is enough to scatter the light and scramble the image.

Now a team of researchers from Spain's Jaume I University (UJI) and the University of València has developed a single-pixel optical system based on compressive sensing that can overcome the fundamental limitations imposed by this scattering. The work was published today in The Optical Society's (OSA) open-access journal Optics Express.

"In the diagnostic realm within the past few years, we've witnessed the way optical imaging has helped clinicians detect and evaluate suspicious lesions," said Jesús Lancis, the paper's co-author and a researcher in the Photonics Research Group at UJI. "The elephant in the room, however, is the question of the short penetration depth of light within tissue compared to ultrasound or x-ray technologies. Current knowledge is insufficient for early detection of small lesions located deeper than a millimeter beneath the surface of the mucosa." "Our goal is to see deeper inside tissue," he added.

To achieve this, the team used an off-the-shelf digital micromirror array from a commercial video projector to create a set of microstructured light patterns that are sequentially superimposed onto a sample. They then measured the transmitted energy with a photodetector that can sense the presence or absence of light but has no spatial resolution. Finally, they applied a signal-processing technique called compressive sensing, which is used to compress large data files as they are measured, to reconstruct the image.

One of the most surprising aspects of the team's work is that they use essentially a single-pixel sensor to capture the images. While most people think that more pixels result in better image quality, there are some cases where this isn't true, Lancis said. In low-light imaging, for instance, it's better to integrate all available light into a single sensor. If the light is split into millions of pixels, each sensor receives a tiny fraction of light, creating noise and destroying the image.
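The measure-then-reconstruct loop described above can be illustrated with a toy simulation. This is a sketch of the general single-pixel compressive-sensing idea, not the UJI team's actual patterns or solver: it uses random ±1 mirror patterns and ISTA, a basic sparse-recovery algorithm, and assumes the scene is sparse.

```python
import numpy as np

# Toy single-pixel camera: the "scene" x is never imaged directly. Each
# measurement y[i] is the total light the single photodetector collects
# when micromirror pattern A[i] is displayed.
rng = np.random.default_rng(0)
n_pixels, n_meas = 64, 32              # fewer measurements than pixels

x_true = np.zeros(n_pixels)
x_true[[5, 20, 41]] = [1.0, 0.7, 0.5]  # sparse scene: 3 bright pixels

# +/-1 patterns (in practice obtained by displaying a 0/1 pattern and its
# complement and subtracting the two detector readings).
A = rng.choice([-1.0, 1.0], size=(n_meas, n_pixels))
y = A @ x_true                         # the single-pixel measurements

# ISTA (iterative soft-thresholding) solves the sparse recovery problem
# min 0.5*||y - Ax||^2 + lam*||x||_1.
step = 1.0 / np.linalg.norm(A, 2) ** 2
lam = 0.01
x = np.zeros(n_pixels)
for _ in range(5000):
    x = x + step * (A.T @ (y - A @ x))                    # gradient step
    x = np.sign(x) * np.maximum(np.abs(x) - step * lam, 0)  # shrinkage

recovered_support = set(np.argsort(np.abs(x))[-3:].tolist())
```

With only half as many measurements as pixels, the three bright pixels are recovered, which is the essence of why one photodetector plus many patterns can substitute for a pixel array.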

Donald Schwartz's curator insight, July 2, 2014 7:33 PM

At least a step in the other direction.


New Species of Beetle Discovered in World's Deepest Cave


We've been to the Moon, but we still haven't discovered everything on our own planet. An expedition to the world's deepest cave, Krubera-Voronja in the Western Caucasus, revealed an interesting subterranean community living below 2,000 meters and represented by more than 12 species of arthropods, including several species new to science. This deep cave biota is composed of troglobionts as well as epigean species that can penetrate down to −2,140 m. The distance from the base of the Krubera-Voronja system to the top is about the same as the height of seven Eiffel Towers. Ambient temperatures are constantly below seven degrees Celsius, and it gets considerably colder the lower you descend. Water temperature is just above freezing.

The study documents the biocoenosis and vertical distribution of the invertebrate fauna of Krubera-Voronja, from its entrance to the remarkable depth of 2,140 meters, including the discovery of the world's deepest-dwelling arthropod.

A new species of ground beetle, named Duvalius abyssimus, was recently discovered by scientists exploring the subterranean fauna living up to 1.5 miles below the earth's surface in Krubera-Voronja. The new creature has adapted to life without light in the world's deepest cave system, with extended antennae and a body that has no pigment. It is about a quarter of an inch long.


Dramatic decline of Caribbean coral reefs: Most corals may disappear within the next 20 years


With only about one-sixth of the original coral cover left, most Caribbean coral reefs may disappear in the next 20 years, primarily due to the loss of grazers in the region, according to the latest report by the Global Coral Reef Monitoring Network (GCRMN), the International Union for Conservation of Nature (IUCN) and the United Nations Environment Programme (UNEP).

The report, Status and Trends of Caribbean Coral Reefs: 1970-2012, is the most detailed and comprehensive study of its kind published to date – the result of the work of 90 experts over the course of three years. It contains the analysis of more than 35,000 surveys conducted at 90 Caribbean locations since 1970, including studies of corals, seaweeds, grazing sea urchins and fish.

The results show that the Caribbean corals have declined by more than 50% since the 1970s. But according to the authors, restoring parrotfish populations and improving other management strategies, such as protection from overfishing and excessive coastal pollution, could help the reefs recover and make them more resilient to future climate change impacts.

“The rate at which the Caribbean corals have been declining is truly alarming,” says Carl Gustaf Lundin, Director of IUCN’s Global Marine and Polar Programme. “But this study brings some very encouraging news: the fate of Caribbean corals is not beyond our control and there are some very concrete steps that we can take to help them recover.”

Climate change has long been thought to be the main culprit in coral degradation. While it does pose a serious threat by making oceans more acidic and causing coral bleaching, the report shows that the loss of parrotfish and sea urchins – the area’s two main grazers – has, in fact, been the key driver of coral decline in the region. An unidentified disease led to a mass mortality of the sea urchin in 1983, and extreme fishing throughout the 20th century has brought the parrotfish population to the brink of extinction in some regions. The loss of these species breaks the delicate balance of coral ecosystems and allows the algae they normally graze on to smother the reefs.

Peter Phillips's curator insight, July 2, 2014 6:27 PM

Scientists have identified the loss of grazers (parrotfish and sea urchins) as the main reason behind the decline in reef health in the Caribbean. The disruption to the reef ecosystem is now understood to be more important than climate change and ocean acidification to the resilience of coral reefs. Overfishing, and a disease which affected sea urchins, led to algal growth which smothers coral.


New State of Matter Discovered: Quantum Droplets of Electrons and their Holes


There was a time when states of matter were simple: solid, liquid, gas. Then came plasma, Bose–Einstein condensate, supercritical fluid and more. Now the list has grown by one more, with the unexpected discovery of a new state dubbed the “dropleton,” which bears some resemblance to a liquid but occurs under very different circumstances. The discovery occurred when a team at the Joint Institute for Laboratory Astrophysics (JILA) at the University of Colorado was focusing laser light on gallium arsenide (GaAs) to create excitons.

Interacting many-body systems are characterized by stable configurations of objects—ranging from elementary particles to cosmological formations—that also act as building blocks for more complicated structures. It is often possible to incorporate interactions in theoretical treatments of crystalline solids by introducing suitable quasiparticles that have an effective mass, spin or charge, which in turn affects the material’s conductivity, optical response or phase transitions. Additional quasiparticle interactions may also create strongly correlated configurations yielding new macroscopic phenomena, such as the emergence of a Mott insulator, superconductivity or the pseudogap phase of high-temperature superconductors. In semiconductors, a conduction-band electron attracts a valence-band hole (electronic vacancy) to create a bound pair, known as an exciton, which is yet another quasiparticle. Two excitons may also bind together to give molecules, often referred to as biexcitons, and even polyexcitons may exist. In indirect-gap semiconductors such as germanium or silicon, a thermodynamic phase transition may produce electron–hole droplets whose diameter can approach the micrometre range. In direct-gap semiconductors such as gallium arsenide, the exciton lifetime is too short for such a thermodynamic process. Instead, different quasiparticle configurations are stabilized dominantly by many-body interactions, not by thermalization. The resulting non-equilibrium quantum kinetics is so complicated that stable aggregates containing three or more Coulomb-correlated electron–hole pairs remain mostly unexplored.

Researchers have now studied such complex aggregates and identified a new stable configuration of charged particles called a quantum droplet. This configuration exists in a plasma and exhibits quantization owing to its small size. It is charge neutral and contains a small number of particles with a pair-correlation function that is characteristic of a liquid. There is experimental and theoretical evidence for the existence of quantum droplets in an electron–hole plasma created in a gallium arsenide quantum well by ultrashort optical pulses.


Record-breaking magnet crams three tons of force into the size of a golf ball


University of Cambridge scientists have broken a decade-old superconducting record by packing a 17.6 Tesla magnetic field into a golf ball-sized hunk of crystal -- equivalent to about three tons of force. The team used high temperature superconductors that work at -320 degrees F or so -- not exactly balmy, but less frigid than the -460 degrees F needed for regular superconductors.

With zero resistance, superconducting materials can carry up to 100 times more current than copper wires, but the resulting magnetic fields create huge internal forces. Since the cuprate materials used for the record are as fragile as dried pasta, they can actually explode under the strain. To get around it, the team modified the material's microstructure and "shrink-wrapped" it in stainless steel. That produced the largest magnetic field ever trapped in a standalone material at any temperature, according to the team. The research might eventually lead to more secure and efficient power transmission, better scanners and yes, levitating monorails.
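The internal forces the article mentions can be estimated from the magnetic pressure P = B²/2μ₀, a standard result; the 17.6 T figure is from the article, and the calculation below is an order-of-magnitude sketch rather than the team's own stress analysis.

```python
import math

# Order-of-magnitude estimate of the stress a trapped field exerts on
# the superconductor, via the magnetic pressure P = B^2 / (2 * mu_0).
mu_0 = 4 * math.pi * 1e-7          # vacuum permeability, in T*m/A
B = 17.6                           # trapped field in tesla (from the article)

pressure_pa = B**2 / (2 * mu_0)    # roughly 1.2e8 Pa (~120 MPa)
pressure_atm = pressure_pa / 101_325
```

That works out to on the order of a thousand atmospheres, far more than a brittle cuprate ceramic can typically withstand unaided, which is why the microstructure modification and stainless-steel "shrink-wrap" were needed.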


A counterintuitive phenomenon discovered: The coexistence of superconductivity with dissipation


For his doctoral dissertation in the Goldman Superconductivity Research Group at the University of Minnesota, Yu Chen, now a postdoctoral researcher at UC Santa Barbara, developed a novel way to fabricate superconducting nanocircuitry. However, the extremely small zinc nanowires he designed did some unexpected things.

Chen, along with his thesis adviser, Allen M. Goldman, and theoretical physicist Alex Kamenev, both of the University of Minnesota, spent years seeking an explanation for these extremely puzzling effects. Their findings appear this week in Nature Physics.

"We were determined to figure out how we could reconcile the strange phenomena with the longstanding rules governing superconductivity," said lead author Chen. "The coexistence of superconductivity with dissipation, which we observed, is counterintuitive and bends the rules as we know them."

Typically superconductivity and dissipation are thought to be mutually exclusive because dissipation, a process in thermodynamic systems whereby electric energy is transformed into heat, is a feature of a normal—versus a superconductive—state.

"But we discovered that superconductivity and dissipation can coexist under rather generic conditions in what appears to be a universal manner," Chen said.

After long and careful work, which involved both experimental and theoretical efforts, the researchers found an explanation that fits. Behind all of the observed phenomena is a peculiar nonequilibrium state of quasiparticles—electron-like excitations that formed in the nanowires Chen designed.

The quasiparticles are created by phase slips. In a superconductive state, when supercurrent flows through the nanowire, the quantum mechanical function describing the superconductivity of the wire evolves along the length of the wire as a spiral shaped like a child's Slinky toy. From time to time, one of the revolutions of the spiral contracts and disappears altogether. This event is called a phase slip. This quirk generates quasiparticles, giving rise to a previously undiscovered voltage plateau state where dissipation and superconductivity coexist.

"The most significant achievement was making the nanowires smaller and cooler than anyone had done previously," Kamenev said. "This allowed the quasiparticles to travel through the wire faster and avoid relaxation. This leads to a peculiar nonthermal state, which combines properties of a superconductor and a normal metal at the same time."


NASA Tests Flying Saucer-Shaped Vehicle (LDSD) for Future Mars Landing


As NASA plans ambitious new robotic missions to Mars, laying the groundwork for even more complex human science expeditions to come, the spacecraft needed to land safely on the red planet's surface necessarily becomes increasingly massive, hauling larger payloads to accommodate extended stays on the Martian surface.

Current technology for decelerating from the high speed of atmospheric entry to the final stages of landing on Mars dates back to NASA's Viking Program, which put two landers on Mars in 1976. The basic Viking parachute design has been used ever since -- and was successfully used again in 2012 to deliver the Curiosity rover to Mars.

NASA seeks to use atmospheric drag as a solution, saving rocket engines and fuel for final maneuvers and landing procedures. The heavier planetary landers of tomorrow, however, will require much larger drag devices than any now in use to slow them down -- and those next-generation drag devices will need to be deployed at higher supersonic speeds to safely land vehicle, crew and cargo. NASA's Low Density Supersonic Decelerator (LDSD) Technology Demonstration Mission, led by NASA's Jet Propulsion Laboratory in Pasadena, Calif., will conduct full-scale, stratospheric tests of these breakthrough technologies high above Earth to prove their value for future missions to Mars.

Three devices are in development. The first two are supersonic inflatable aerodynamic decelerators -- very large, durable, balloon-like pressure vessels that inflate around the entry vehicle and slow it from Mach 3.5 or greater to Mach 2 or lower. These decelerators are being developed in 6-meter-diameter and 8-meter-diameter configurations. Also in development is a 30.5-meter-diameter parachute that will further slow the entry vehicle from Mach 1.5 or Mach 2 to subsonic speeds. All three devices will be the largest of their kind ever flown at speeds several times greater than the speed of sound.

Together, these new drag devices can increase payload delivery to the surface of Mars from our current capability of 1.5 metric tons to 2 to 3 metric tons, depending on which inflatable decelerator is used in combination with the parachute. They will increase available landing altitudes by 2-3 kilometers, increasing the accessible surface area we can explore. They also will improve landing accuracy from a margin of 10 kilometers to just 3 kilometers. All these factors will increase the capabilities and robustness of robotic and human explorers on Mars.

To thoroughly test the system, the LDSD team will fly the drag devices several times -- at full scale and at supersonic speeds -- high in Earth’s stratosphere, simulating entry into the atmosphere of Mars. The investigators are conducting design verification tests of parachutes and supersonic inflatable aerodynamic decelerators through 2013. Supersonic flight tests will be conducted in 2014 and 2015 from the Pacific Missile Range Facility in Barking Sands, HI.

Once tested, the devices will enable missions that maximize the capability of current launch vehicles, and could be used in Mars missions launching as early as 2020.

The LDSD project is sponsored by NASA’s Space Technology Mission Directorate and is managed by the Jet Propulsion Laboratory.

Matt Mayevsky's curator insight, July 1, 2014 4:49 AM

Come on, I would like to see it;)


Bell’s theorem still reverberates: How entanglement makes the impossible possible


Fifty years ago, John Bell made metaphysics testable, but quantum scientists still dispute the implications.

In 1964, Northern Irish physicist John Bell proved mathematically that certain quantum correlations, unlike all other correlations in the Universe, cannot arise from any local cause. This theorem has become central to both metaphysics and quantum information science. But 50 years on, the experimental verifications of these quantum correlations still have ‘loopholes’, and scientists and philosophers still dispute exactly what the theorem states.

Quantum theory does not predict the outcomes of a single experiment, but rather the statistics of possible outcomes. For experiments on pairs of ‘entangled’ quantum particles, Bell realized that the predicted correlations between outcomes in two well-separated laboratories can be profoundly mysterious (see ‘How entanglement makes the impossible possible’). Correlations of this sort, called Bell correlations, were verified experimentally more than 30 years ago (see, for example, ref. 2). As Bell proved in 1964, this leaves two options for the nature of reality. The first is that reality is irreducibly random, meaning that there are no hidden variables that “determine the results of individual measurements”. The second option is that reality is ‘non-local’, meaning that “the setting of one measuring device can influence the reading of another instrument, however remote”.
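These "profoundly mysterious" correlations are usually quantified through the CHSH form of Bell's inequality, which the article does not spell out; the short sketch below uses the textbook singlet-state prediction E(a, b) = −cos(a − b) to show the quantum value exceeding the local-hidden-variable bound of 2.

```python
import math

# Singlet-state correlation between spin measurements at analyzer
# angles a and b (standard quantum-mechanical prediction).
def E(a, b):
    return -math.cos(a - b)

# CHSH combination at the angles that maximize the quantum violation.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
# Any local-hidden-variable model obeys S <= 2; quantum mechanics
# reaches 2*sqrt(2) ~ 2.83 at these angles.
```

The gap between 2 and 2√2 is exactly what the loophole-free experiments described below are designed to measure.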

Most physicists are localists: they recognize the two options but choose the first, because hidden variables are, by definition, empirically inaccessible. Quantum information scientists embrace irreducible randomness as a resource for secure cryptography [3]. Other physicists and philosophers (the ‘non-localist camp’) dispute that there are two options, and insist that Bell’s theorem mandates non-locality [4].

Bell himself was a non-localist, an opinion he first published in 1976 [6], after introducing a concept, “local causality”, that is subtly different from the locality of the 1964 theorem. Deriving this from Einstein’s principle requires an even stronger notion of causation: if two events are statistically correlated, then either one causes the other, or they have a common cause which, when taken into account, eliminates the correlation.

In 1976, Bell proved that his new concept of local causality (based implicitly on the principle of common cause) was ruled out by Bell correlations [6]. In this 1976 theorem there was no second option, as there had been in the 1964 theorem, of giving up hidden variables. Nature violates local causality.

Experiments in 1982 by a team led by French physicist Alain Aspect [2], using well-separated detectors with settings changed just before the photons were detected, suffered from an ‘efficiency loophole’ in that most of the photons were not detected. This allows the experimental correlations to be reproduced by (admittedly, very contrived) local hidden variable theories.

In 2013, this loophole was closed in photon-pair experiments using high-efficiency detectors [7, 8]. But they lacked large separations and fast switching of the settings, opening the ‘separation loophole’: information about the detector setting for one photon could have propagated, at light speed, to the other detector, and affected its outcome.

There are several groups worldwide racing to do the first Bell experiment with large separation, efficient detection and fast switching. It will be a landmark achievement in physics. But would such an experiment really close all the loopholes? The answer depends on one’s attitude to causation.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Individually addressable arrays of replica microbial cultures enabled by splitting SlipChips

Individually addressable arrays of replica microbial cultures enabled by splitting SlipChips | Amazing Science |
A diagnostic tool that's about the size of a credit card has identified a highly prized gut microbe.

Isolating microbes carrying genes of interest from environmental samples is important for applications in biology and medicine. However, this involves the use of genetic assays that often require lysis of microbial cells, which is not compatible with the goal of obtaining live cells for isolation and culture.

A recent development from Caltech describes the design, fabrication, biological validation, and underlying physics of a microfluidic SlipChip device that addresses this challenge. The device is composed of two conjoined plates containing 1,000 microcompartments, each comprising two juxtaposed wells, one on each opposing plate. Single microbial cells are stochastically confined and subsequently cultured within the microcompartments. Each microcompartment is then split into two replica droplets, both containing microbial culture, by controllably separating the two plates while retaining one droplet within each well. The researchers experimentally characterize droplet retention as a function of capillary pressure, viscous pressure, and the viscosity of the aqueous phase. Within each pair of replicas, one can be used for genetic analysis, while the other preserves live cells for growth. This microfluidic approach provides a facile way to cultivate anaerobes from complex communities. The researchers validated the method by targeting, isolating, and culturing Bacteroides vulgatus, a core gut anaerobe, from a clinical sample. To date, this methodology has enabled isolation of a novel microbial taxon, representing a new genus.
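The droplet-retention physics mentioned above hinges on capillary pressure at the pinned liquid interface. A rough sketch of the governing Young–Laplace relation is below; the cylindrical-well form and every numerical value are illustrative assumptions, not figures from the paper:

```python
import math

def capillary_pressure(gamma, theta_deg, r):
    """Young-Laplace capillary pressure (Pa) across a liquid interface
    pinned in a cylindrical well of radius r (m), for interfacial
    tension gamma (N/m) and contact angle theta (degrees)."""
    return 2 * gamma * math.cos(math.radians(theta_deg)) / r

# Illustrative values (assumed, not from the paper): water/oil
# interfacial tension ~40 mN/m, contact angle 30 deg, 200 um well.
print(capillary_pressure(0.04, 30, 200e-6))  # ≈ 346 Pa
```

The inverse dependence on well radius is the design lever: smaller wells hold their droplets against larger viscous pressures as the plates slide apart.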

This approach could also be extended to the study of other microorganisms and even mammalian cells or tissue samples, and may enable targeted retrieval of solutions in applications including digital PCR, sequencing, single cell analysis, and protein crystallization.

Melanie Patterson's curator insight, July 2, 2014 5:57 PM

This is clever.  Rustem Ismagilov's group (formerly from the UofC) is using their slip chip invention to solve problems - I never expected him to start growing gut microbes.  Great story. 

Scooped by Dr. Stefan Gruenwald!

Oldest case of Down's syndrome from medieval France

Oldest case of Down's syndrome from medieval France | Amazing Science |

The oldest confirmed case of Down's syndrome has been found: the skeleton of a child who died 1500 years ago in early medieval France. According to the archaeologists, the way the child was buried hints that Down's syndrome was not necessarily stigmatized in the Middle Ages.

Down's syndrome is a genetic disorder that delays a person's growth and causes intellectual disability. People with Down's syndrome have three copies of chromosome 21, rather than the usual two. It was described in the 19th century, but has probably existed throughout human history. However there are few cases of Down's syndrome in the archaeological record.

The new example comes from a 5th- and 6th-century necropolis near a church in Chalon-sur-Saône in eastern France. Excavations there have uncovered the remains of 94 people, including the skeleton of a young child with a short and broad skull, a flattened skull base and thin cranial bones. These features are common in people with Down's syndrome, says Maïté Rivollat at the University of Bordeaux in France, who has studied the skeleton with her colleagues.

"I think the paper makes a convincing case for a diagnosis of Down's syndrome," says John Starbuck at Indiana University in Indianapolis. He has just analyzed a 1500-year-old figurine from the Toltec culture of Mexico that he says depicts someone with Down's syndrome.

A similar argument was put forward in a 2011 study that described the 1500-year-old burial in Israel of a man with dwarfism (International Journal of Osteoarchaeology, DOI: 10.1002/oa.1285). The body was buried in a similar manner to others at the site, and archaeologists took that as indicating that the man was treated as a normal member of society.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Global Warming: Failing To Control Earth’s CO2

Global Warming: Failing To Control Earth’s CO2 | Amazing Science |

The whole world gathered in Copenhagen recently for COP15, the UN climate change conference. On the agenda was how to cope with the rise in CO2 emissions, which drives ocean acidification and could raise the ocean level by as much as 60 cm by the end of the century.

This will jeopardize those living on islands and along shorelines; an estimated 100 million people may be threatened. In fact, humans are pumping 7 Gt of CO2 into the atmosphere yearly. The level of CO2 in the atmosphere today is around 370 ppm; according to specialists, it needs to remain below 420 ppm through the end of this century to keep global warming below 2 °C. Most solutions to reduce this trend are not short-term ones. An integrated approach to carbon abatement in the automotive sector could reduce global passenger-vehicle greenhouse emissions by 2.2 Gt by 2030, much of it using proven technologies. Sugarcane-based ethanol production in Brazil, currently on 8 million hectares, can be substantially increased, but that must be done without harming the environment. The ethanol produced from 200 million tons of corn in the US will help reduce greenhouse gas emissions from cars. Together, the two countries today supply only a fraction of what will be needed to replace automotive fossil fuels in the years to come.

This means abatement will not come from first-generation biofuels alone, but from a combination of second-generation biofuels, traffic flow shifts and a mix of several other technologies. Carbon capture and storage can handle a few million metric tons of CO2/year, while 6 billion metric tons of coal are burned each year, producing 18 billion tons of CO2.

Brazil hopes to reverse deforestation in the Amazon, which in recent decades has claimed an area larger than Germany, according to the National Institute for Space Research (INPE). To accomplish this, the National Plan of Climatic Changes in Brazil was presented in Copenhagen, and it included efforts to achieve reforestation by 2020. This is a costly and long-term effort.

But deforestation is not a problem of the tropical forest alone. The vegetation of other ecosystems has been drastically reduced. There is just 7% of the original vegetation of Mata Atlântica left. The “Cerrado” is being destroyed at a rate of 0.5% a year. Inadequate use of this biome, for ethanol production, for instance, could destroy the remaining 17% of the Cerrado.

So what can be done if the level of CO2 cannot be kept under control? Geo-engineering proposes simulated volcanic eruptions to reduce the planet's temperature and the level of ocean rise, based upon observations made after the Mount Pinatubo eruption in June 1991. That eruption injected 10 Tg of sulfur into the stratosphere, which caused detectable short-term cooling of the planet. One simulated injection of SO2 as an aerosol precursor, equivalent to the Mount Pinatubo eruption, every two years would cool the planet and consequently keep the sea-level rise below 20 cm for centuries ahead, although the (relatively less deadly) ocean acidification due to CO2 would persist.

I attended several discussions on this subject where most people accepted this fate, like lambs to the slaughterhouse. I proposed a strategy to desalinize sea water for irrigation or as a source of potable water where water is needed most: arid regions of developing countries. If the ocean level rises at a rate of 6 mm/year, and since the oceans occupy 360 million km², the amount of water to be desalinized is 2.16 × 10¹² m³ per year. Considering that at least 10% of the planet's land is arid, this amount of water corresponds to only about 144 mm of rain falling on 15 million km².
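The figures above can be sanity-checked in a few lines, taking only the stated inputs (an ocean area of 360 million km² and a 6 mm/year rise):

```python
# Volume of a 6 mm/year rise over the oceans (360e6 km^2 = 3.6e14 m^2).
ocean_area_m2 = 360e6 * 1e6
rise_m_per_year = 6e-3
volume_m3 = ocean_area_m2 * rise_m_per_year
print(volume_m3)                # 2.16e12 m^3 per year, as quoted

# Spread over 15 million km^2 of arid land, expressed as mm of "rain".
arid_area_m2 = 15e6 * 1e6
depth_mm = volume_m3 / arid_area_m2 * 1000
print(depth_mm)                 # 144 mm per year
```

At roughly 144 mm/year, well under typical crop water requirements, the conclusion of the next paragraph follows directly.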

So the amount of desalinized water from ocean rise alone may be insufficient to irrigate large areas adequately. Desalinized water could also be stored in reservoirs and underground aquifers. Potable water is scarce in many regions of the world, particularly in sub-Saharan Africa. According to UNEP, the lack of good-quality potable water threatens the lives of 1.1 billion people worldwide through infections resulting from unclean drinking water. Throughout most of the world, the most common contamination of raw water sources is from human sewage, in particular human faecal pathogens and parasites. In 2006, waterborne diseases were estimated to cause 1.8 million deaths each year, while about 1.1 billion people lacked proper drinking water. Thus, it is clear that people in the developing world need access to good-quality water in sufficient quantity, and the means to purify and distribute it.

In recent years, most desalination plants have yielded around 10⁷ m³ of desalinized water annually. Alternative technologies may be needed to allow for desalination of 2.16 × 10¹² m³/year, the equivalent of 6 mm of ocean rise per year. A single project to desalinize water from the Red Sea in Jordan has the capacity to produce 850 million m³ of desalinated water per year, roughly 85 times the typical plant yield. The cost will be more than $10 billion, but the project will benefit Israel, Jordan and the Palestinian Authority. Under the leadership of the Ministry of Water and Irrigation of Jordan, the project may need to gather close to $40 billion for its complete implementation; this is achievable if additional bidders come aboard.

The project will stand as a symbol of peace and cooperation in the Middle East. Still, one project yielding 8.5 × 10⁸ m³ per year implies that roughly 2,500 projects of this magnitude would be needed to handle 2.16 × 10¹² m³. Ted Levin from the Natural Resources Defense Council says that more than 12,000 desalination plants already supply fresh water in 120 nations, mostly in the Middle East and the Caribbean. According to analysts, the market for desalination will grow substantially over the next decades.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

More than 99% of drug trials for Alzheimer's disease during the past decade have failed

More than 99% of drug trials for Alzheimer's disease during the past decade have failed | Amazing Science |

There is an urgent need to increase the number of potential therapies being investigated, say US scientists. Only one new medicine has been approved since 2004, they report in the journal Alzheimer's Research & Therapy.

The drug failure rate is troubling and higher than for other diseases such as cancer, says Alzheimer's Research UK. Dr Jeffrey Cummings, of the Cleveland Clinic Lou Ruvo Center for Brain Health in Las Vegas, and colleagues examined a public website that records clinical trials.

Between 2002 and 2012, they found 99.6% of trials of drugs aimed at preventing, curing or improving the symptoms of Alzheimer's had failed or been discontinued. This compares with a failure rate of 81% for cancer drugs.

The failure rate was "especially troubling" given the rising numbers of people with dementia, said Dr Simon Ridley, of Alzheimer's Research UK. "The authors of the study highlight a worrying decline in the number of clinical trials for Alzheimer's treatments in more recent years," he said.

"There is a danger that the high failure rates of trials in the past will discourage pharmaceutical companies from investing in dementia research.

"The only way we will successfully defeat dementia is to continue with high quality, innovative research, improve links with industry and increase investment in clinical trials."

Sandy Spencer's curator insight, July 6, 2014 9:37 AM

This is so discouraging. I know everyone has high hopes of a cure or at least something to slow it down. But our wait goes on--

Scooped by Dr. Stefan Gruenwald!

Scientists discover Achilles' heel in antibiotic-resistant bacteria

Scientists discover Achilles' heel in antibiotic-resistant bacteria | Amazing Science |
Scientists at the University of East Anglia have made a breakthrough in the race to solve antibiotic resistance.

New research published today in the journal Nature reveals an Achilles' heel in the defensive barrier which surrounds drug-resistant bacterial cells.

The findings pave the way for a new wave of drugs that kill superbugs by bringing down their defensive walls rather than attacking the bacteria itself. It means that in future, bacteria may not develop drug-resistance at all.

The discovery doesn't come a moment too soon. The World Health Organization has warned that antibiotic resistance in bacteria is spreading globally, with severe consequences: even common infections that have been treatable for decades can once again kill.

Researchers investigated a class of bacteria called Gram-negative bacteria, which are particularly resistant to antibiotics because of their cells' impermeable lipid-based outer membrane. This outer membrane acts as a defensive barrier against attacks from the human immune system and antibiotic drugs. It allows the pathogenic bacteria to survive, but removing this barrier causes the bacteria to become more vulnerable and die.

Until now little has been known about exactly how the defensive barrier is built. The new findings reveal how bacterial cells transport the barrier building blocks (called lipopolysaccharides) to the outer surface. Group leader Prof Changjiang Dong, from UEA's Norwich Medical School, said: "We have identified the path and gate used by the bacteria to transport the barrier building blocks to the outer surface. Importantly, we have demonstrated that the bacteria would die if the gate is locked."

"This is really important because drug-resistant bacteria is a global health problem. Many current antibiotics are becoming useless, causing hundreds of thousands of deaths each year.

"The number of super-bugs are increasing at an unexpected rate. This research provides the platform for urgently-needed new generation drugs." Lead author PhD student Haohao Dong said: "The really exciting thing about this research is that new drugs will specifically target the protective barrier around the bacteria, rather than the bacteria itself.

"Because new drugs will not need to enter the bacteria itself, we hope that the bacteria will not be able to develop drug resistance in future."

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Soft-Robotics: The robots of the future won't look anything like the Terminator

Soft-Robotics: The robots of the future won't look anything like the Terminator | Amazing Science |

The field of soft robotics has attracted a rush of attention in the last year. Down the road at Harvard, multiple groups are working on soft robotic hands, jumping legs, exosuits, and quadrupeds that can do the limbo. At Worcester Polytechnic Institute's Soft Robotics Lab, researchers are building a snake. In San Francisco, a startup called Otherlab is building inflatable robots that can shake hands, walk, and carry riders. In Italy, a group of researchers built a robotic tentacle modeled after an octopus.

Before the 1970s, car companies made cars safer by making them larger and heavier. Then along came the airbag: a lightweight safety device that folded up invisibly into the vehicle until it sensed a crash. Similar revolutions took place with body armor, bridges, and contact lenses, and these researchers believe something similar is happening with robots.

"It’s not a part of conventional robotics technologies," says Fumiya Iida, a professor of bio-inspired robotics at the Swiss Federal Institute of Technology-Zurich and a member of the IEEE committee on soft robotics. "They have to think completely differently, use different materials, different energy sources. Definitely this is the way we should go in the long run." One of the most impressive rigid robots in the world right now is Boston Dynamics’ 300-pound humanoid Atlas. If Atlas wants to pick up a ball, it needs to sense and compute the precise distance between its digits and the ball and figure out exactly where to place its hand and how much pressure to apply.

Robots like Atlas "are doing a lot of thinking," says Barry Trimmer, PhD, a professor at Tufts and the editor of a new journal, Soft Robotics, which launched last month. "There’s a lot of hesitancy. ‘Where do I put my foot next?’ Animals just don't do that. We need to get away from the idea that you have to control every variable."

By contrast, Harvard’s starfish-shaped soft gripper only needs to be told to inflate. As it’s pumped full of air, it conforms to the shape of an object until its "fingers" have enough pressure to lift it. Another example would be a human picking up a glass of water. We don’t have to compute the exact size and shape of the glass with our brains; our hand adapts to the object. Similarly, Bubbles doesn’t calculate the full length of its movement.

There are technological challenges as well. In addition to air and fluid pressure actuators, soft roboticists are experimenting with dielectric elastomers, elastic materials that expand and contract in response to electric voltage; shape-memory alloys, metal alloys that can be programmed to change shape at certain temperatures; and springs that respond to light. These approaches are still rudimentary, as are the control systems that operate the robots. In the case of many of Harvard’s soft robots, it’s simply a syringe of air attached to a tube.

The field is so new, however, that no possibilities have yet been ruled out. Soft robotics technologies could theoretically be used in a wearable pair of human wings. More practically, soft robots could easily pack eggs or pick fruit — traditional hard robots, equipped with superhuman grips, are more likely to break yolks and inadvertently make applesauce. A mass of wormlike "meshworm" robots could be filled with water and dropped over a disaster area, where they would crawl to survivors. A soft robotic sleeve could be worn to eliminate tremors or supplement strength lost with age. Soft robots could be used in space exploration, where weight is hugely important; in prosthetics, where they would provide comfort and lifelikeness; in the home, where they can help out around the house without trampling the dog; and in surgical robots, where operators have inspired a few lawsuits after puncturing patients' insides.

Rudolf Kabutz's curator insight, July 3, 2014 6:38 AM

Do robots have to be hard and metallic? Soft spongy robots could have many advantages.

Anne Pascucci, MPA, CRA's curator insight, July 3, 2014 8:44 AM

Very cool!

Scooped by Dr. Stefan Gruenwald!

19th Century Jacobi Math Tactic Gets a Makeover—and Yields Answers Up to 200 Times Faster

19th Century Jacobi Math Tactic Gets a Makeover—and Yields Answers Up to 200 Times Faster | Amazing Science |

A relic from long before the age of supercomputers, the 169-year-old math strategy called the Jacobi iterative method is widely dismissed today as too slow to be useful. But thanks to a curious, numbers-savvy Johns Hopkins engineering student and his professor, it may soon get a new lease on life.

With just a few modern-day tweaks, the researchers say they've made the rarely used Jacobi method work up to 200 times faster. The result, they say, could speed up the performance of computer simulations used in aerospace design, shipbuilding, weather and climate modeling, biomechanics and other engineering tasks.

Their paper describing this updated math tool was published June 27 in the online edition of the Journal of Computational Physics. 

"For people who want to use the Jacobi method in computational mechanics, a problem that used to take 200 days to solve may now take only one day," said Rajat Mittal, a mechanical engineering professor in the university's Whiting School of Engineering and senior author of the journal article. "Our paper provides the recipe for how to speed up this method significantly by just changing four or five lines in the computer code."

Simulation data showing significantly faster reduction in solution error for the new Scheduled Relaxation Jacobi (SRJ) method as compared to the classical Jacobi and Gauss-Seidel iterative methods. The equation being solved here is the two-dimensional Laplace equation on a 128×128 grid.
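For readers who want to see the baseline the speed-up is measured against, here is a minimal sketch of classical Jacobi iteration for the 2D Laplace equation. The grid size, tolerance, and boundary values are illustrative assumptions; the paper's SRJ variant additionally applies a pre-computed schedule of over- and under-relaxation factors, which is not shown here:

```python
import numpy as np

def jacobi_laplace(u, tol=1e-8, max_iter=10_000):
    """Solve the 2D Laplace equation with fixed (Dirichlet) boundaries by
    classical Jacobi iteration: every interior point is replaced by the
    average of its four neighbours until updates fall below tol."""
    for iteration in range(max_iter):
        new = u.copy()
        new[1:-1, 1:-1] = 0.25 * (u[:-2, 1:-1] + u[2:, 1:-1] +
                                  u[1:-1, :-2] + u[1:-1, 2:])
        if np.max(np.abs(new - u)) < tol:
            return new, iteration
        u = new
    return u, max_iter

# Example: 32x32 grid, top edge held at 1, other edges at 0.
grid = np.zeros((32, 32))
grid[0, :] = 1.0
solution, iters = jacobi_laplace(grid)
```

Each sweep touches only the previous iterate, which is why Jacobi parallelizes trivially; its weakness, and the target of the SRJ schedule, is how slowly those sweeps damp long-wavelength error.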

No comment yet.
Scooped by Dr. Stefan Gruenwald!

FOXO1: Single gene switch to convert human gastrointestinal cells to insulin-producing cells

FOXO1: Single gene switch to convert human gastrointestinal cells to insulin-producing cells | Amazing Science |

By switching off a single gene, scientists have converted human gastrointestinal cells into insulin-producing cells, demonstrating in principle that a drug could retrain cells inside a person’s GI tract to produce insulin. The finding raises the possibility that cells lost in type 1 diabetes may be more easily replaced through the reeducation of existing cells than through the transplantation of new cells created from embryonic or adult stem cells. The new research was reported in the online issue of the journal Nature Communications.

"People have been talking about turning one cell into another for a long time, but until now we hadn't gotten to the point of creating a fully functional insulin-producing cell by the manipulation of a single target," said the study's senior author, Domenico Accili, MD, the Russell Berrie Foundation Professor of Diabetes (in Medicine) at Columbia University Medical Center (CUMC).

For nearly two decades, researchers have been trying to make surrogate insulin-producing cells for type 1 diabetes patients. In type 1 diabetes, the body's natural insulin-producing cells are destroyed by the immune system.

Although insulin-producing cells can now be made in the lab from stem cells, these cells do not yet have all the functions of naturally occurring pancreatic beta cells.

This has led some researchers to try instead to transform existing cells in a patient into insulin-producers. Previous work by Dr. Accili's lab had shown that mouse intestinal cells can be transformed into insulin-producing cells; the current Columbia study shows that this technique also works in human cells.

The Columbia researchers were able to teach human gut cells to make insulin in response to physiological circumstances by deactivating the cells' FOXO1 gene. Accili and postdoctoral fellow Ryotaro Bouchi first created a tissue model of the human intestine with human pluripotent stem cells. Through genetic engineering, they then deactivated any functioning FOXO1 inside the intestinal cells. After seven days, some of the cells started releasing insulin and, equally important, only in response to glucose.

The team had used a comparable approach in its earlier mouse study. In the mice, insulin made by gut cells was released into the bloodstream, worked like normal insulin, and was able to nearly normalize blood glucose levels in otherwise diabetic mice (see "New Approach to Treating Type I Diabetes? Columbia Scientists Transform Gut Cells into Insulin Factories"). That work, reported in 2012 in the journal Nature Genetics, has since received independent confirmation from another group.

Peter Phillips's curator insight, July 2, 2014 6:43 PM

New hope for diabetics - without a transplant.

Eric Chan Wei Chiang's curator insight, July 13, 2014 10:08 AM

These findings indicate that gastrointestinal cells and insulin producing β cells in the pancreas probably differentiated from the same line of cells during development. Insulin production in gastrointestinal cells is probably deactivated by the FOXO1 gene.


This opens up new possibilities as there is already a proof of concept for treating HIV with induced pluripotent stem cells.

Scooped by Dr. Stefan Gruenwald!

2030: Giant space telescope could detect hints of life on exoplanets

2030: Giant space telescope could detect hints of life on exoplanets | Amazing Science |

The Advanced Technology Large-Aperture Space Telescope (ATLAST), a giant space telescope 20 meters across that could give scientists a good chance of detecting hints of life on exoplanets (planets around other stars), has been proposed by U.S. and European scientists.

In a recent talk at the National Astronomy Meeting (NAM 2014), Prof. Martin Barstow of the University of Leicester and President of the Royal Astronomical Society called for governments and space agencies around the world to back the ambitious project.

The telescope would be capable of analyzing the light from planets the size of the Earth in orbit around other nearby stars, searching for features in their spectra such as molecular oxygen, ozone, water and methane that could suggest the presence of life. It might also be able to see how the surfaces of planets change with the seasons.
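A one-line diffraction estimate shows why such a large aperture matters for picking Earth-sized planets out of their stars' glare. Only the 20 m aperture comes from the article; the 550 nm observing wavelength is an illustrative assumption:

```python
import math

# Rayleigh criterion: theta ~ 1.22 * lambda / D for a circular aperture.
wavelength_m = 550e-9   # visible light (assumed)
aperture_m = 20.0       # proposed ATLAST primary mirror diameter
theta_rad = 1.22 * wavelength_m / aperture_m
theta_mas = math.degrees(theta_rad) * 3600 * 1000  # milliarcseconds
print(theta_mas)  # ≈ 7 mas
```

By the same formula, Hubble's 2.4 m mirror is limited to roughly 8 times coarser resolution at the same wavelength.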

ATLAST would study star and galaxy formation at high resolution, constructing the history of star birth in detail and establishing how intergalactic matter was and is assembled into galaxies over billions of years.

If it goes ahead, ATLAST could be launched around 2030. Before this can happen, there are technical challenges to overcome, such as enhancing the sensitivities of detectors and increasing the efficiencies of the coatings on the mirror segments. Such a large structure may also need to be assembled in space before deployment rather than launching on a single rocket. All of this means that a decision to construct the telescope needs to happen soon for it to go ahead.

In the nearly 25 years since the launch of the Hubble Space Telescope (HST), astronomers and the public alike have enjoyed ground-breaking views of the cosmos and the suite of scientific discoveries that followed. The successor to HST, the James Webb Space Telescope, should launch in 2018 but will have a comparatively short lifetime.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Cisco Projects the Internet of Things to be a 14 Trillion Dollar Industry

Cisco Projects the Internet of Things to be a 14 Trillion Dollar Industry | Amazing Science |

Twenty years ago, there were about 3 million devices connected to the Internet. By the end of this decade, Gartner estimates that there will be 26 billion devices on the global network.

This can only mean one thing: We’re living in the Internet of Things.

With anything and everything — including trees, insects, pill bottles, and sinks — going online, Cisco projects the Internet of Things to be a $14 trillion revenue opportunity. Helping people remember their daily medicine with light-up bottle caps, preventing illegal logging, and monitoring traffic in real time are worthwhile goals. But they are point solutions. They don’t resonate in our lives in ways that make it impossible to imagine how we lived without them.

In order for the Internet of Things to truly work, context about ourselves (think interests, location, intent) is required. Here are some reasons why the Internet of Things will only come to fruition if identity is incorporated into the user experience.

A wristband that measures your steps and heart rate is a helpful fitness tool. However, if that’s all it does, then it is nothing more than a tool, no matter how many other devices it can connect to. But what about a wristband that knows the wearer’s identity, understands his fitness routines, and tells the treadmill to speed up or slow down based on the wearer’s heart rate and exercise goals?

This sort of personalized connectedness delivers true value and breeds customer loyalty by tapping into each user’s unique situation and background. And it all starts with a deep understanding of users’ identities.

In the Internet of Things, devices need to participate in a constant conversation with one another, their owners’ social feeds, and outside fields of interest. Any device that relies on a one-time data dump will quickly become irrelevant. Connected devices need to be able not only to verify identity but also to be flexible enough to grow and adapt as new channels and data points emerge.

The Internet of Things should model the kind of tailored, identity-driven recommendations today’s consumer is accustomed to receiving from leading brands like Amazon, Netflix and Spotify. To compare and contrast, let’s say you have a refrigerator that reorders eggs when you run out. Such a feature would be helpful, but likely would not deliver enough value to gain widespread adoption. However, if your refrigerator automatically shops for a recipe you just pinned, and recommends three new options for dinner based on what you have in the house and your dinner party, that’s extraordinary.

Matt Mayevsky's curator insight, July 1, 2014 4:46 AM

Paradigm shift from clouding to Internet of Things

Marc Kneepkens's curator insight, July 3, 2014 10:08 AM

Huge acceleration in the tech world. Jobs are being created as we speak, startups have a long ways to go still, many more are needed.

Siegfried Holle's curator insight, July 4, 2014 8:40 AM

We are moving to an age of personal abundance .It is an exciting time .

Scooped by Dr. Stefan Gruenwald!

New germ-killing nanosurface opens up new front in hygiene

New germ-killing nanosurface opens up new front in hygiene | Amazing Science |
Imagine a hospital room, door handle or kitchen countertop that is free from bacteria—and not one drop of disinfectant or boiling water or dose of microwaves has been needed to zap the germs.

That is the idea behind a startling discovery made by scientists in Australia.

In a study published on Tuesday in the journal Nature Communications, they described how a dragonfly led them to a nano-tech surface that physically slays bacteria.

The germ-killer is black silicon, a substance discovered accidentally in the 1990s and now viewed as a promising semiconductor material for solar panels.

Under an electron microscope, its surface is a forest of spikes just 500 nanometres (500 billionths of a metre) high that rip open the cell walls of any bacterium that comes into contact, the scientists found. It is the first time any water-repellent surface has been found to act physically as a bactericide.

Last year, the team, led by Elena Ivanova at Swinburne University of Technology in Melbourne, were stunned to find cicada wings were potent killers of Pseudomonas aeruginosa—an opportunist germ that also infects humans and is becoming resistant to antibiotics.

Looking closely, they found that the answer lay not in any biochemical on the wing, but in regularly-spaced "nanopillars" on which bacteria were sliced to shreds as they settled on the surface. They took the discovery further by examining nanostructures studding the translucent forewings of a red-bodied Australian dragonfly called the wandering percher (Latin name Diplacodes bipunctata). It has spikes that are somewhat smaller than those on the black silicon—they are 240 nanometres high.

The dragonfly's wings and black silicon were put through their paces in a lab, and both were ruthlessly bactericidal. Smooth to the human touch, the surfaces destroyed two categories of bacteria, Gram-negative and Gram-positive, as well as spores, the protective shells that coat certain types of dormant germs.

The three targeted bugs comprised P. aeruginosa, the notorious Staphylococcus aureus and the ultra-tough spore of Bacillus subtilis, a wide-ranging soil germ that is a cousin of anthrax. The killing rate was 450,000 bacterial cells per square centimetre per minute over the first three hours of exposure. This is 810 times the minimum dose needed to infect a person with S. aureus, and a whopping 77,400 times that of P. aeruginosa.
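The figures in this paragraph hang together under quick arithmetic. A minimal cross-check; note that the two infectious doses below are back-derived from the article's stated multiples, not independently measured values:

```python
# Back-of-the-envelope check of the kill-rate figures quoted above.

KILL_RATE = 450_000  # bacterial cells per cm^2 per minute (from the study)
MINUTES = 3 * 60     # the first three hours of exposure

cells_killed = KILL_RATE * MINUTES  # total cells killed per cm^2 over 3 hours

# Infectious doses implied by the quoted multiples (derived, not measured):
dose_s_aureus = cells_killed / 810         # S. aureus
dose_p_aeruginosa = cells_killed / 77_400  # P. aeruginosa

print(f"{cells_killed:,} cells per cm^2")      # 81,000,000 cells per cm^2
print(f"~{round(dose_s_aureus):,} cells")      # ~100,000 cells
print(f"~{round(dose_p_aeruginosa):,} cells")  # ~1,047 cells
```

The implied doses are plausible: roughly 100,000 cells for S. aureus and on the order of 1,000 cells for the more easily infectious P. aeruginosa.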

Peter Phillips's curator insight, July 2, 2014 6:48 PM

Learning from nature: dragonfly wings. Minute structures on their surface pop bacteria like balloons, opening possibilities to reduce antibiotic use in hospitals.

Scooped by Dr. Stefan Gruenwald!

Rate of deforestation in Indonesia overtakes Brazil


Indonesia lost 840,000 hectares of forest in 2012 compared to 460,000 hectares in Brazil, despite its forest being a quarter the size of the Amazon rainforest.

Indonesia has greatly under-reported how much primary rainforest it is cutting down, according to the government's former head of forestry data gathering.

UN and official government figures have maintained that the country with the third biggest stretch of tropical forest after the Amazon and Congo was losing 310,000 hectares of all its forest a year between 2000 and 2005, increasing to 690,000 hectares annually from 2006 to 2010.

Exact rates of Indonesian deforestation have varied with different figures quoted by researchers and government, but a new study, which claims to be the most comprehensive yet, suggests that nearly twice as much primary forest is being cut down as in Brazil, the historical global leader.

Belinda Arunarwati Margono, who was in charge of data gathering at Indonesia's Ministry of Forestry for seven years and is now on secondment at South Dakota State University, calculates that nearly 1m more hectares of primary forest may have been felled in the last 12 years than was officially recorded.

In the paper, published on Sunday in the journal Nature Climate Change, Margono says primary forest losses totalled 6.02m hectares between 2000 and 2012, increasing by around 47,600 hectares a year over this time. Because previous estimates of forest loss have included the clearing of pulp plantations and oil palm estates, the real loss of primary forest has until now been obscured.
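The reported total and annual increase can be sanity-checked together. A minimal sketch, assuming loss grew linearly over the 13 reported years (2000 through 2012 inclusive); the per-year figures it derives are inferences from that assumption, not reported numbers:

```python
# Sanity check of the deforestation totals above under a linear-growth model.

TOTAL_LOSS = 6_020_000  # hectares of primary forest lost, 2000-2012
YEARS = 13              # 2000 through 2012, inclusive
GROWTH = 47_600         # extra hectares lost each year versus the year before

# Arithmetic series: TOTAL = YEARS * first + GROWTH * YEARS * (YEARS - 1) / 2
first_year = (TOTAL_LOSS - GROWTH * YEARS * (YEARS - 1) / 2) / YEARS
last_year = first_year + GROWTH * (YEARS - 1)

print(round(first_year))  # implied loss in 2000: roughly 177,000 hectares
print(round(last_year))   # implied loss in 2012: roughly 749,000 hectares
```

The implied 2012 figure of roughly 749,000 hectares sits just below the 840,000-hectare total-forest loss quoted for that year, consistent with part of that total being non-primary forest.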

Scooped by Dr. Stefan Gruenwald!

Cosmic Journeys: 50 years of space exploration


Humans have traveled to the far corners of the solar system through the eyes of robotic explorers: spacecraft, probes, and rovers that have sent back progressively more astonishing data and images. The colored lines illustrate nearly 200 unmanned missions since 1958: flybys, orbits, soft landings, and intentional crashes, as well as some of the failures. No human has left low Earth orbit since 1972, when Apollo 17 made the last of NASA's nine manned missions to the moon. But odds are we will again. A privately funded mission aims to send a man and a woman around Mars as early as 2018.
