Amazing Science
Scooped by Dr. Stefan Gruenwald!

Machine learning rivals human skills in cancer detection


Two announcements yesterday (April 21) suggest that deep learning algorithms rival human skills in detecting cancer from ultrasound images and other sources.


Samsung Medison, a global medical equipment company and an affiliate of Samsung Electronics, has just updated its RS80A ultrasound imaging system with a deep learning algorithm for breast-lesion analysis. The “S-Detect for Breast” feature uses big data collected from breast-exam cases and recommends whether the selected lesion is benign or malignant. It is used in lesion segmentation, characteristic analysis, and assessment processes, providing “more accurate results.”


“We saw a high level of conformity from analyzing and detecting lesions in various cases by using the S-Detect,” said professor Han Boo Kyung, a radiologist at Samsung Medical Center. “Users can reduce unnecessary biopsies, and doctors-in-training will likely have more reliable support in accurately detecting malignant and suspicious lesions.”


Meanwhile, researchers from the Regenstrief Institute and Indiana University School of Informatics and Computing at Indiana University-Purdue University Indianapolis say they’ve found that open-source machine learning tools are as good as — or better than — humans in extracting crucial meaning from free-text (unstructured) pathology reports and detecting cancer cases. The computer tools are also faster and less resource-intensive. U.S. states require cancer cases to be reported to statewide cancer registries for disease tracking, identification of at-risk populations, and recognition of unusual trends or clusters. This free-text information can be difficult for health officials to interpret, which can further delay health department action, when action is needed.


“We think that it’s no longer necessary for humans to spend time reviewing text reports to determine if cancer is present or not,” said study senior author Shaun Grannis, M.D., M.S., interim director of the Regenstrief Center of Biomedical Informatics.


“We have come to the point in time that technology can handle this. A human’s time is better spent helping other humans by providing them with better clinical care. Everything — physician practices, health care systems, health information exchanges, insurers, as well as public health departments — is awash in oceans of data. How can we hope to make sense of this deluge of data? Humans can’t do it — but computers can.”

This is especially relevant for underserved nations, where a majority of clinical data is collected in the form of unstructured free text, he said.


The researchers sampled 7,000 free-text pathology reports from over 30 hospitals that participate in the Indiana Health Information Exchange and used open source tools, classification algorithms, and varying feature selection approaches to predict if a report was positive or negative for cancer. The results indicated that a fully automated review yielded results similar or better than those of trained human reviewers, saving both time and money.
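The core of such an automated review is a text classifier trained on labeled reports. As a minimal sketch — not the study's actual open-source toolchain, and using invented toy reports — a bag-of-words naive Bayes classifier in plain Python illustrates the idea:

```python
import math
import re
from collections import Counter, defaultdict

def tokenize(text):
    """Lowercase a report and split it into word tokens."""
    return re.findall(r"[a-z]+", text.lower())

class NaiveBayesReportClassifier:
    """Bag-of-words naive Bayes with Laplace smoothing."""
    def __init__(self):
        self.word_counts = defaultdict(Counter)   # label -> token counts
        self.label_counts = Counter()
        self.vocab = set()

    def fit(self, reports, labels):
        for text, label in zip(reports, labels):
            self.label_counts[label] += 1
            for tok in tokenize(text):
                self.word_counts[label][tok] += 1
                self.vocab.add(tok)

    def predict(self, text):
        total = sum(self.label_counts.values())
        best_label, best_score = None, -math.inf
        for label in self.label_counts:
            # log prior + smoothed log likelihood of each token
            score = math.log(self.label_counts[label] / total)
            denom = sum(self.word_counts[label].values()) + len(self.vocab)
            for tok in tokenize(text):
                score += math.log((self.word_counts[label][tok] + 1) / denom)
            if score > best_score:
                best_label, best_score = label, score
        return best_label

# Toy training reports (invented for illustration only).
reports = [
    "biopsy shows invasive ductal carcinoma",
    "malignant cells present in specimen",
    "benign fibroadenoma no atypia",
    "normal tissue without abnormality",
]
labels = ["positive", "positive", "negative", "negative"]

clf = NaiveBayesReportClassifier()
clf.fit(reports, labels)
flagged = clf.predict("specimen shows malignant carcinoma")  # -> "positive"
```

A production pipeline would of course use far richer features and thousands of labeled reports, as the Regenstrief study did, but the underlying prediction step — score each class by word statistics and pick the best — is the same.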

Scooped by Dr. Stefan Gruenwald!

Water molecules confined in nanochannels exhibit quantum tunneling behavior

Water molecules confined in nanochannels exhibit tunneling behavior that smears out the positions of the hydrogen atoms into a pair of corrugated rings.


Tunneling is a quantum effect that lets particles go through microscopic barriers in a single bound. A study of water trapped in an emerald-like crystal reveals tunneling of water molecules among multiple orientations, so that each molecule is essentially in six configurations at once. The researchers showed with neutron scattering experiments that the tunneling causes the water’s hydrogen atoms to spread out into ring-like distributions. This new form of water is a more symmetric structure that is predicted to have zero electric dipole moment—the property that normally allows water to form hydrogen bonds and perform well as a solvent.


Tunneling occurs when an object traverses a barrier without having enough energy to do so classically. Certain molecules can tunnel among rotational orientations. A representative example is the methyl group (CH3), which is a carbon atom bound to three hydrogens in a symmetric pyramid configuration. Electric forces from nearby atoms generate repulsion that resists any rotation around the pyramid axis. However, the hydrogens can tunnel through these barriers from one pyramid corner to the next. This discrete hopping couples together rotational orientations, causing an observable splitting of the ground state into multiple levels with slightly different energies.
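The level splitting this hopping produces can be sketched with a simple circulant-matrix model: N equivalent orientations with on-site energy E0 and a tunneling amplitude t between neighboring orientations have eigenvalues E_k = E0 − 2t·cos(2πk/N). The numbers below are arbitrary illustrations, not fitted methyl-group data:

```python
import math

def tunnel_levels(e0, t, n):
    """Energy levels of n-fold rotational tunneling (circulant hopping model).

    The n x n Hamiltonian has e0 on the diagonal and -t between neighboring
    orientations (periodic), whose exact eigenvalues are the cosine band below.
    """
    return sorted(e0 - 2 * t * math.cos(2 * math.pi * k / n) for k in range(n))

# Methyl group: n = 3 orientations -> a singlet (A) plus a doubly
# degenerate pair (E), split by 3*t (toy energy units).
levels = tunnel_levels(e0=0.0, t=0.01, n=3)
# levels is approximately [-0.02, 0.01, 0.01]
```

This is why tunneling shows up spectroscopically: without hopping (t = 0) the three orientations are degenerate, and switching on t splits them into observably distinct levels.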


Recently, optical spectroscopy revealed energy splitting in the terahertz spectrum of water molecules in the gemstone beryl, suggesting that the molecule is hopping among multiple states [1]. The crystal structure of beryl (Be3Al2Si6O18) contains channels with hexagonal cross-sections that can trap water molecules. The channels periodically narrow into “cages” roughly 0.5 nanometers wide by 0.9 nanometers long and only big enough for one water molecule. The previously observed splitting suggested that the confined water was rotationally tunneling inside the channels, but a more direct test was necessary. Now Alexander Kolesnikov from Oak Ridge National Laboratory (ORNL) in Tennessee and his colleagues have performed a series of neutron scattering measurements on a beryl sample containing water.

Rescooped by Dr. Stefan Gruenwald from Health and Biomedical Informatics!

The role of big data in medicine


Technology is revolutionizing our understanding and treatment of disease, says the founding director of the Icahn Institute for Genomics and Multiscale Biology at New York’s Mount Sinai Health System.


The role of big data in medicine is one where we can build better health profiles and better predictive models around individual patients so that we can better diagnose and treat disease.


One of the main limitations with medicine today and in the pharmaceutical industry is our understanding of the biology of disease. Big data comes into play around aggregating more and more information around multiple scales for what constitutes a disease—from the DNA, proteins, and metabolites to cells, tissues, organs, organisms, and ecosystems. Those are the scales of the biology that we need to be modeling by integrating big data. If we do that, the models will evolve, the models will build, and they will be more predictive for given individuals.


It’s not going to be a discrete event—that all of a sudden we go from not using big data in medicine to using big data in medicine. I view it as more of a continuum, more of an evolution. As we begin building these models, aggregating big data, we’re going to be testing and applying the models on individuals, assessing the outcomes, refining the models, and so on. Questions will become easier to answer. The modeling becomes more informed as we start pulling in all of this information. We are at the very beginning stages of this revolution, but I think it’s going to go very fast, because there’s great maturity in the information sciences beyond medicine.


The life sciences are not the first to encounter big data. We have information-power companies like Google and Amazon and Facebook, and a lot of the algorithms that are applied there—to predict what kind of movie you like to watch or what kind of foods you like to buy—use the same machine-learning techniques. Those same types of methods, the infrastructure for managing the data, can all be applied in medicine.

Via fjms
Scooped by Dr. Stefan Gruenwald!

Physics: Let's unite to build a quantum Internet


One of the greatest challenges for implementing a globally distributed quantum computer or a quantum internet is entangling nodes across the network. Qubits can then be teleported between any pair and processed by local quantum computers.


Ideally, nodes should be entangled either in pairs or by creating a large, multi-entangled 'cluster state' that is broadcast to all nodes. Cluster states that link thousands of nodes have already been created in the laboratory. The challenges are to demonstrate how they might be deployed over long distances, as well as how to store quantum states at the nodes and update them constantly using quantum codes.


Quantum networks require memories to store quantum information, ideally for hours — shielding it from unwanted interactions with the environment. Such memories are needed for quantum computing at nodes and also for the faithful, long-distance distribution of entanglement through quantum repeaters.


Quantum memories need to convert electromagnetic radiation into physical changes in matter with near-perfect read–write fidelity and at high capacity. 'Spin ensembles' represent one type of quantum memory. Ultracold atomic gases consisting of about one million atoms of rubidium can convert a single photon into a collective atomic excitation known as a spin wave. Storage times are approaching the 100 milliseconds required to transmit an optical signal across the world.


Solid-state quantum memories are even more appealing. Crystalline-solid spin ensembles — created by inserting lattice defects known as nitrogen-vacancy centres into diamonds, or by doping rare-earth crystals — can remain coherent for hours at cryogenic temperatures.


Superconducting qubits, which are defined by physical quantities such as the charge of a capacitor or the flux of an inductor, interact within a quantum processor by releasing and absorbing microwave photons. For the successful integration of solid-state quantum memory, reversible storage and retrieval of quantum information must be made possible. This will require an efficient interface between the microwave photons and the atomic spins of a solid-state quantum memory that is attached to the processor. If successful, this hybrid technology would become the most promising architecture to be scaled up into a large, distributed quantum computer.

Scooped by Dr. Stefan Gruenwald!

Scientists crack brain “Enigma code” showing how two brain regions talk to each other


Researchers at the University of Glasgow have discovered what two parts of the brain are saying to one another when processing visual images. Until now, scientists have only been able to tell whether two parts of the brain are communicating with each other.


The research marks a huge ‘step up’ in interpreting brain activity, opening up a range of opportunities such as studying what happens to the brain’s network as it ages or the effects of a stroke when brain processes are disrupted. It also raises the possibility of future research into ‘robot vision’.


Philippe Schyns, professor of psychology at the university’s centre for cognitive neuroimaging, said: “With Enigma, we knew the Germans were communicating, but we didn’t know what they were saying. Just like if you’re walking down the street and you see two people talking in the distance: you know they are communicating with each other, but you don’t know what they are saying.


“Communication between brain regions has so far been like these examples: we know it’s happening, but we don’t know what it’s about. Through our research, we have been able to ‘break the code,’ so to speak, and therefore glean what two parts of the brain are saying to each other.”


Scooped by Dr. Stefan Gruenwald!

Sophisticated ‘Mini-Brains’ Add to Evidence of Zika’s Toll on Fetal Cortex


Studying a new type of pinhead-size, lab-grown brain made with technology first suggested by three high school students, Johns Hopkins researchers have confirmed a key way in which Zika virus causes microcephaly and other damage in fetal brains: by infecting specialized stem cells that build its outer layer, the cortex.


The lab-grown mini-brains, which researchers say are truer to life and more cost-effective than similar research models, came about thanks to the son of two Johns Hopkins scientists and two other high school students who were doing summer research internships. They had the idea to make the equipment for growing the mini-brains with a 3-D printer. These so-called bioreactors, and the mini-brains they foster, should open other new and valuable windows into human brain development, brain disorders and drug testing -- and perhaps even produce neurons for treatment of Parkinson's disease and other disorders, the investigators say. A report on the research appears online April 22 in the journal Cell.


"We have been working for three years to develop a better research model of brain development, and it's fortunate we can now use this one to shed light on the major public health crisis posed by Zika infections," says Hongjun Song, Ph.D., professor of neurology and neuroscience at the Johns Hopkins University School of Medicine's Institute for Cell Engineering. "This more realistic, 3-D model confirms what we suspected based on what we saw in a two-dimensional cell culture: that Zika causes microcephaly -- abnormally small brains and heads -- mainly by attacking the neural progenitor cells that build the brain and turning them into virus factories."

Scooped by Dr. Stefan Gruenwald!

The Universe, where space-time becomes discrete

A theoretical study has analyzed a model that saves special relativity and reconciles it with granularity by introducing small-scale deviations from the principle of locality, demonstrating that it can be experimentally tested with great precision.


Our experience of space-time is that of a continuous object, without gaps or discontinuities, just as it is described by classical physics. For some quantum gravity models, however, the texture of space-time is "granular" at tiny scales (below the so-called Planck scale, 10⁻³³ cm), as if it were a variable mesh of solids and voids (or a complex foam). One of the great problems of physics today is to understand the passage from a continuous to a discrete description of spacetime: is there an abrupt change, or is there a gradual transition? Where does the change occur?


The separation between one world and the other creates problems for physicists: for example, how can we describe gravity -- explained so well by classical physics -- according to quantum mechanics? Quantum gravity is in fact a field of study in which no consolidated and shared theories exist as yet. There are, however, "scenarios," which offer possible interpretations of quantum gravity subject to different constraints, and which await experimental confirmation or confutation.


One of the problems to be solved in this respect is that if space-time is granular beyond a certain scale it means that there is a "basic scale," a fundamental unit that cannot be broken down into anything smaller, a hypothesis that clashes with Einstein's theory of special relativity. Imagine holding a ruler in one hand: according to special relativity, to an observer moving in a straight line at a constant speed (close to the speed of light) relative to you, the ruler would appear shorter. But what happens if the ruler has the length of the fundamental scale? For special relativity, the ruler would still appear shorter than this unit of measurement. Special relativity is therefore clearly incompatible with the introduction of a basic graininess of spacetime. Suggesting the existence of this basic scale, say the physicists, means to violate Lorentz invariance, the fundamental tenet of special relativity.
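The ruler thought-experiment is just the length-contraction formula of special relativity, L = L0·√(1 − v²/c²): for any rest length L0, there is some speed v at which the measured length drops below any proposed fundamental scale. A small illustrative calculation (toy numbers, not a quantum-gravity computation):

```python
import math

def contracted_length(rest_length, v, c=299_792_458.0):
    """Length (in meters) measured by an observer moving at speed v (m/s)
    relative to a ruler of the given rest length: L = L0 * sqrt(1 - v^2/c^2)."""
    return rest_length * math.sqrt(1 - (v / c) ** 2)

planck_like = 1e-35  # meters; roughly the order of the Planck length
# At 99% of light speed, the contraction factor is sqrt(1 - 0.99^2) ~ 0.14,
# so even a Planck-length ruler would be measured as shorter still.
shorter = contracted_length(planck_like, v=0.99 * 299_792_458.0)
```

This is the heart of the conflict: a "shortest possible length" singles out a preferred frame, which is exactly what Lorentz invariance forbids.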


So how can the two be reconciled? Physicists can either hypothesize violations of Lorentz invariance, but have to satisfy very strict constraints (and this has been the preferred approach so far), or they must find a way to avoid the violation and find a scenario that is compatible with both granularity and special relativity. This scenario is in fact implemented by some quantum gravity models such as String Field Theory and Causal Set Theory. The problem to be addressed, however, was how to test their predictions experimentally given that the effects of these theories are much less apparent than are those of the models that violate special relativity. One solution to this impasse has now been put forward by Stefano Liberati, SISSA professor, and colleagues in their latest publication. The study was conducted with the participation of researchers from the LENS in Florence (Francesco Marin and Francesco Marino) and from the INFN in Padua (Antonello Ortolan). Other SISSA scientists taking part in the study, in addition to Liberati, were PhD student Alessio Belenchia and postdoc Dionigi Benincasa. The research was funded by a grant of the John Templeton Foundation.

"We respect Lorentz invariance, but everything comes at a price, which in this case is the introduction of non-local effects," comments Liberati. The scenario studied by Liberati and colleagues in fact salvages special relativity but introduces the possibility that physics at a certain point in space-time can be affected by what happens not only in proximity to that point but also at regions very far from it. "Clearly we do not violate causality nor do we presuppose information that travels faster than light," points out the scientist. "We do, however, introduce a need to know the global structure so as to understand what's going on at a local level."

Scooped by Dr. Stefan Gruenwald!

The black-hole collision that reshaped physics

A momentous signal from space has confirmed decades of theorizing on black holes — and launched a new era of gravitational-wave astronomy.


The event was catastrophic on a cosmic scale — a merger of black holes that violently shook the surrounding fabric of space and time, and sent a blast of space-time vibrations known as gravitational waves rippling across the Universe at the speed of light. But it was the kind of calamity that physicists on Earth had been waiting for. On 14 September, when those ripples swept across the freshly upgraded Laser Interferometer Gravitational-Wave Observatory (Advanced LIGO), they showed up as spikes in the readings from its two L-shaped detectors in Louisiana and Washington state. For the first time ever, scientists had recorded a gravitational-wave signal.


“There it was!” says LIGO team member Daniel Holz, an astrophysicist at the University of Chicago in Illinois. “And it was so strong, and so beautiful, in both detectors.” Although the shape of the signal looked familiar from the theory, Holz says, “it's completely different when you see something in the data. It's this transcendent moment”.


The signal, formally designated GW150914 after the date of its occurrence and informally known to its discoverers as 'the Event', has justly been hailed as a milestone in physics. It has provided a wealth of evidence for Albert Einstein's century-old general theory of relativity, which holds that mass and energy can warp space-time, and that gravity is the result of such warping. Stuart Shapiro, a specialist in computer simulations of relativity at the University of Illinois at Urbana–Champaign, calls it “the most significant confirmation of the general theory of relativity since its inception”.


But the Event also marks the start of a long-promised era of gravitational-wave astronomy. Detailed analysis of the signal has already yielded insights into the nature of the black holes that merged, and how they formed. With more events such as these — the LIGO team is analysing several other candidate events captured during the detectors' four-month run, which ended in January — researchers will be able to classify and understand the origins of black holes, just as they are doing with stars.


Still more events should appear starting in September, when Advanced LIGO is scheduled to begin joint observations with its European counterpart, the Franco–Italian-led Advanced Virgo facility near Pisa, Italy. (The two collaborations already pool data and publish papers together.) This detector will not only contribute crucial details to events, but could also help astronomers to make cosmological-distance measurements more accurately than before.


“It's going to be a really good ride for the next few years,” says Bruce Allen, managing director of the Max Planck Institute for Gravitational Physics in Hanover, Germany. “The more black holes they see whacking into each other, the more fun it will be,” says Roger Penrose, a theoretical physicist and mathematician at the University of Oxford, UK, whose work in the 1960s helped to lay the foundation for the theory of the objects. “Suddenly, we have a new way of looking at the Universe.”

Scooped by Dr. Stefan Gruenwald!

Approaching electronic DNA circuits: Making precise graphene pattern with DNA


DNA’s unique structure is ideal for carrying genetic information, but scientists have recently found ways to exploit this versatile molecule for other purposes: By controlling DNA sequences, they can manipulate the molecule to form many different nanoscale shapes.


Chemical and molecular engineers at MIT and Harvard University have now expanded this approach by using folded DNA to control the nanostructure of inorganic materials. After building DNA nanostructures of various shapes, they used the molecules as templates to create nanoscale patterns on sheets of graphene. This could be an important step toward large-scale production of electronic chips made of graphene, a one-atom-thick sheet of carbon with unique electronic properties.

“This gives us a chemical tool to program shapes and patterns at the nanometer scale, forming electronic circuits, for example,” says Michael Strano, a professor of chemical engineering at MIT and a senior author of a paper describing the technique in the April 9 issue of Nature Communications.


Peng Yin, an assistant professor of systems biology at Harvard Medical School and a member of Harvard’s Wyss Institute for Biologically Inspired Engineering, is also a senior author of the paper, and MIT postdoc Zhong Jin is the lead author. Other authors are Harvard postdocs Wei Sun and Yonggang Ke, MIT graduate students Chih-Jen Shih and Geraldine Paulus, and MIT postdocs Qing Hua Wang and Bin Mu.


Most of these DNA nanostructures are made using a novel approach developed in Yin’s lab. Complex DNA nanostructures with precisely prescribed shapes are constructed using short synthetic DNA strands called single-stranded tiles. Each of these tiles acts like an interlocking toy brick and binds with four designated neighbors. Using these single-stranded tiles, Yin’s lab has created more than 100 distinct nanoscale shapes, including the full alphabet of capital English letters and many emoticons. These structures are designed using computer software and can be assembled in a simple reaction. Alternatively, such structures can be constructed using an approach called DNA origami, in which many short strands of DNA fold a long strand into a desired shape.


However, DNA tends to degrade when exposed to sunlight or oxygen, and can react with other molecules, so it is not ideal as a long-term building material. “We’d like to exploit the properties of more stable nanomaterials for structural applications or electronics,” Strano says. Instead, he and his colleagues transferred the precise structural information encoded in DNA to sturdier graphene. The chemical process involved is fairly straightforward, Strano says: First, the DNA is anchored onto a graphene surface using a molecule called aminopyrine, which is similar in structure to graphene. The DNA is then coated with small clusters of silver along the surface, which allows a subsequent layer of gold to be deposited on top of the silver.


Once the molecule is coated in gold, the stable metallized DNA can be used as a mask for a process called plasma lithography. Oxygen plasma, a very reactive “gas flow” of ionized molecules, is used to wear away any unprotected graphene, leaving behind a graphene structure identical to the original DNA shape. The metallized DNA is then washed away with sodium cyanide.

Scooped by Dr. Stefan Gruenwald!

Researchers demonstrate hydrogen atoms on graphene yield a magnetic moment


A team of researchers with members from institutions in Spain, France and Egypt has demonstrated that hydrogen atoms on graphene yield a magnetic moment and furthermore, that such moments can order ferromagnetically over relatively large distances. In their paper published in the journal Science the group describes experiments they carried out in attempting to cause a sheet of graphene to become magnetic, how they found evidence that it was possible using hydrogen atoms, and the ways such a material might be used in industrial applications. Shawna Hollen with the University of New Hampshire, and Jay Gupta with Ohio State University, offer some insights into the work done by the team in the same journal issue with a Perspectives piece—they also outline the hurdles that still need to be overcome before magnetic graphene might be used in real applications.

Scooped by Dr. Stefan Gruenwald!

Dark matter does not contain certain axion-like particles - Stockholm University


Physicists are still struggling with the conundrum of identifying more than 80 percent of the matter in the Universe. One possibility is that it is made up of extremely light particles which weigh less than a billionth of the mass of the electron. These particles are often called axion-like particles (ALPs). Since ALPs are hard to find, researchers have not yet been able to test the different types of ALPs that could be part of the dark matter.


For the first time the researchers used data from NASA's gamma-ray telescope on the Fermi satellite to study light from the central galaxy of the Perseus galaxy cluster in the hunt for ALPs. The researchers found no traces of ALPs and, for the first time, the observations were sensitive enough to exclude certain types of ALPs (ALPs can only constitute dark matter if they have certain characteristics).


One cannot detect ALPs directly but there is a small chance that they transform into ordinary light and vice versa when travelling through a magnetic field. A research team at Stockholm University used a very bright light source, the central galaxy of the Perseus galaxy cluster, to look for these transformations. The energetic light, so-called gamma radiation, from this galaxy could change its nature to ALPs while traveling through the magnetic field that fills the gas between the galaxies in the cluster.


“The ALPs we have been able to exclude could explain a certain amount of dark matter. What is particularly interesting is that with our analysis we are reaching a sensitivity that we thought could only be obtained with dedicated future experiments on Earth”, says Manuel Meyer, post-doc at the Department of Physics, Stockholm University.


Searches for ALPs with the Fermi telescope will continue. More than 80 percent of the matter in the Universe remains to be identified. The mysterious dark matter shows itself only through its gravity; it neither absorbs nor radiates any form of light.


Scooped by Dr. Stefan Gruenwald!

Genomics for the masses: AstraZeneca launches project to sequence 2 million genomes

Drug company aims to pool genomic and medical data in hunt for rare genetic sequences associated with disease.


One of the world’s largest pharmaceutical companies has launched a massive effort to compile genome sequences and health records from two million people over the next decade. In doing so, AstraZeneca and its collaborators hope to unearth rare genetic sequences that are associated with disease and with responses to treatment.


It’s an unprecedented number of participants for this type of study, says Ruth March, vice-president and head of personalized health care and biomarkers at AstraZeneca, which is headquartered in London. “That’s necessary because we’re going to be looking for very rare differences among individuals.”


To achieve that ambitious goal, AstraZeneca will partner with research institutions including the Wellcome Trust Sanger Institute in Hinxton, UK, and Human Longevity, a biotechnology company founded in San Diego, California, by genomics pioneer Craig Venter. AstraZeneca also expects to draw on data from 500,000 participants in its own clinical trials, and medical samples that it has accrued over the past 15 years.


In doing so, AstraZeneca will be following a burgeoning trend in genetics research. For years, geneticists pursued common variations in human DNA sequences that are linked to complex diseases such as diabetes and heart disease. The approach yielded some important insights, but these common variations often accounted for only a small percentage of the genetic contribution to individual diseases.


Researchers are now increasingly focusing on the contribution of unusual genetic variants to disease. Combinations of these variants can hold the key to an individual's traits, says Venter.


The hunt for important rare variants has led AstraZeneca to partner with the Institute for Molecular Medicine Finland, says Aarno Palotie, who heads the Human Genomics Program there. Finland’s population was geographically isolated until recently, he notes, which makes for a unique genetic make-up. As a result, some variations that are very rare in other populations may be more common in Finland, making them easier to detect and study.

Scooped by Dr. Stefan Gruenwald!

You can now be identified by your ‘brainprint’ with 100% accuracy


Binghamton University researchers have developed a biometric identification method called Cognitive Event-RElated Biometric REcognition (CEREBRE) for identifying an individual’s unique “brainprint.” They recorded the brain activity of 50 subjects wearing an electroencephalograph (EEG) headset while looking at selected images from a set of 500 images.


The researchers found that participants’ brains reacted uniquely to each image — enough so that a computer system that analyzed the different reactions was able to identify each volunteer’s “brainprint” with 100 percent accuracy.

In their original brainprint study in 2015, published in Neurocomputing (see ‘Brainprints’ could replace passwords), the research team was able to identify one person out of a group of 32 by that person’s responses, with 97 percent accuracy. That study only used words. Switching to images made a huge difference.


It’s only a three-point difference, but going from 97 to 100 percent makes possible a reliable system for high-security situations, such as “ensuring the person going into the Pentagon or the nuclear launch bay is the right person,” said Assistant Professor of Psychology Sarah Laszlo. “You don’t want to be 97 percent accurate for that, you want to be 100 percent accurate.”

Laszlo says brain biometrics are appealing because they can be cancelled (meaning the person can simply do another EEG session) and cannot be imitated or stolen by malicious means, the way a finger or retina can (as in the movie Minority Report).

“If someone’s fingerprint is stolen, that person can’t just grow a new finger to replace the compromised fingerprint — the fingerprint for that person is compromised forever.


Fingerprints are ‘non-cancellable.’ Brainprints, on the other hand, are potentially cancellable. So, in the unlikely event that attackers were actually able to steal a brainprint from an authorized user, the authorized user could then ‘reset’ their brainprint,” Laszlo explained.

Scooped by Dr. Stefan Gruenwald!

Scientists take next step towards observing quantum physics in real life


Small objects like electrons and atoms behave according to quantum mechanics, with quantum effects like superposition, entanglement and teleportation. One of the most intriguing questions in modern science is whether large objects – like a coffee cup – could also show this behavior. Scientists at TU Delft have taken the next step towards observing quantum effects at everyday temperatures in large objects. They created a highly reflective membrane, visible to the naked eye, that can vibrate with hardly any energy loss at room temperature. The membrane is a promising candidate for researching quantum mechanics in large objects.

Rescooped by Dr. Stefan Gruenwald from Dr. Goulu!

LHC data at your fingertips


Today the CMS collaboration at CERN released more than 300 terabytes (TB) of high-quality open data. These include more than 100 TB of data from proton collisions at 7 TeV, making up half the data collected at the LHC by the CMS detector in 2011. This release follows a previous one from November 2014, which made available around 27 TB of research data collected in 2010.


The data are available on the CERN Open Data Portal and come in two types. The primary datasets are in the same format used by the collaboration to perform research. The derived datasets, on the other hand, require a lot less computing power and can be readily analyzed by university or high school students.


CMS is also providing the simulated data generated with the same software version that should be used to analyze the primary datasets. Simulations play a crucial role in particle physics research. The data release is accompanied by analysis tools and code examples tailored to the datasets. A virtual machine image based on CernVM, which comes preloaded with the software environment needed to analyze the CMS data, can also be downloaded from the portal.
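A classic first exercise with collision data of this kind is reconstructing the invariant mass of a particle pair from its four-momenta. A self-contained sketch (the muon four-momenta below are made-up illustrative values, not taken from the released datasets):

```python
import math

def invariant_mass(p1, p2):
    """Invariant mass (GeV) of a two-particle system.
    Each particle is a four-momentum (E, px, py, pz) in GeV,
    natural units with c = 1: M^2 = E^2 - |p|^2."""
    E  = p1[0] + p2[0]
    px = p1[1] + p2[1]
    py = p1[2] + p2[2]
    pz = p1[3] + p2[3]
    return math.sqrt(max(E**2 - px**2 - py**2 - pz**2, 0.0))

# Hypothetical dimuon event: two back-to-back 45.6 GeV muons
# (muon rest mass neglected) reconstruct to ~91.2 GeV, the Z boson mass.
mu1 = (45.6, 0.0, 0.0,  45.6)
mu2 = (45.6, 0.0, 0.0, -45.6)
mass = invariant_mass(mu1, mu2)
```

Histogramming this quantity over many events is how resonance peaks such as the Z (or the J/ψ) emerge from the data.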

Via Goulu
Scooped by Dr. Stefan Gruenwald!

Europe plans giant billion-euro quantum technologies project


The European Commission has quietly announced plans to launch a €1-billion (US$1.13 billion) project to boost a raft of quantum technologies — from secure communication networks to ultra-precise gravity sensors and clocks. 


The initiative, to launch in 2018, will be similar in size, timescale and ambition to two existing European flagships, the decade-long Graphene Flagship and the Human Brain Project, although the exact format has yet to be decided, Nathalie Vandystadt, a commission spokesperson, told Nature. Funding will come from a mixture of sources, including the commission, as well as other European and national funders, she added.


The commission is likely to have a “substantial role” in funding the flagship, says Tommaso Calarco, who leads the Integrated Quantum Science and Technology centre at the Universities of Ulm and Stuttgart in Germany. He co-authored a blueprint behind the initiative, which was published in March, called the Quantum Manifesto. Countries around the world are investing in these technologies, says Calarco. Without such an initiative, Europe risks becoming a second-tier player, he says. “The time is really now or never.”

Scooped by Dr. Stefan Gruenwald!

The Atom Without Properties

The microscopic world is governed by the rules of quantum mechanics, where the properties of a particle can be completely undetermined and yet strongly correlated with those of other particles. Physicists from the University of Basel have observed these so-called Bell correlations for the first time between hundreds of atoms. Their findings are published in the scientific journal Science.


Everyday objects possess properties independently of each other and regardless of whether we observe them or not. Einstein famously asked whether the moon still exists if no one is there to look at it; we answer with a resounding yes. This apparent certainty does not exist in the realm of small particles. The location, speed or magnetic moment of an atom can be entirely indeterminate and yet still depend greatly on the measurements of other distant atoms.


With the (false) assumption that atoms possess their properties independently of measurements and independently of each other, a so-called Bell inequality can be derived. If it is violated by the results of an experiment, it follows that the properties of the atoms must be interdependent. This is described as Bell correlations between atoms, which also imply that each atom takes on its properties only at the moment of the measurement. Before the measurement, these properties are not only unknown -- they do not even exist.


A team of researchers led by professors Nicolas Sangouard and Philipp Treutlein from the University of Basel, along with colleagues from Singapore, have now observed these Bell correlations for the first time in a relatively large system, specifically among 480 atoms in a Bose-Einstein condensate. Earlier experiments showed Bell correlations with a maximum of four light particles or 14 atoms. The results mean that these peculiar quantum effects may also play a role in larger systems.


In order to observe Bell correlations in systems consisting of many particles, the researchers first had to develop a new method that does not require measuring each particle individually – which would require a level of control beyond what is currently possible. The team succeeded in this task with the help of a Bell inequality that was only recently discovered. The Basel researchers tested their method in the lab with small clouds of ultracold atoms cooled with laser light down to a few billionths of a degree above absolute zero. The atoms in the cloud constantly collide, causing their magnetic moments to become slowly entangled. When this entanglement reaches a certain magnitude, Bell correlations can be detected. Author Roman Schmied explains: “One would expect that random collisions simply cause disorder. Instead, the quantum-mechanical properties become entangled so strongly that they violate classical statistics.”
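The Basel experiment relies on a recently discovered many-particle Bell inequality, but the flavor of a Bell test is easiest to see in the textbook two-particle CHSH version: for two spins in the singlet state the quantum correlation is E(a, b) = −cos(a − b), and at suitable measurement angles the CHSH combination exceeds the bound of 2 that any classical (local-realist) model must obey. A short sketch:

```python
import math

def E(a, b):
    """Correlation between spin measurements along angles a and b
    for two particles in the singlet state: E(a, b) = -cos(a - b)."""
    return -math.cos(a - b)

# CHSH combination S = E(a,b) - E(a,b') + E(a',b) + E(a',b').
# Local-realist models obey |S| <= 2; quantum mechanics reaches 2*sqrt(2).
a, a2 = 0.0, math.pi / 2              # Alice's two measurement settings
b, b2 = math.pi / 4, 3 * math.pi / 4  # Bob's two measurement settings

S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
# |S| = 2*sqrt(2) ~ 2.83 > 2: the classical bound is violated
```

The 480-atom experiment uses a different, collective inequality precisely because evaluating pairwise correlations atom by atom is infeasible, but the logic — a measurable bound that no classical model can exceed — is the same.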

Scooped by Dr. Stefan Gruenwald!

Skeletal stem cells form the blueprint of the face structure


Timing is everything when it comes to the development of the vertebrate face. In a new study published in PLoS Genetics, USC Stem Cell researcher Lindsey Barske from the laboratory of Gage Crump and her colleagues identify the roles of key molecular signals that control this critical timing.


Previous work from the Crump lab and others demonstrated that two types of molecular signals, called Jagged-Notch and Endothelin1 (Edn1), are critical for shaping the face. Loss of these signals results in facial deformities in both zebrafish and humans, revealing them as essential for patterning the faces of all vertebrates.


Using sophisticated genetic, genomic and imaging tools to study zebrafish, the researchers discovered that Jagged-Notch and Edn1 work in tandem to control where and when stem cells turn into facial cartilage. In the lower face, Edn1 signals accelerate cartilage formation early in development. In the upper face, Jagged-Notch signals prevent stem cells from making cartilage until later in development. The authors found that these differences in the timing of stem cells turning into cartilage play a major role in making the upper and lower regions of the face distinct from one another.


"We've shown that the earliest blueprint of the facial skeleton is set up by spatially intersecting signals that control when stem cells turn into cartilage or bone. Logically, therefore, small shifts in the levels of these signals throughout evolution could account for much of the diversity of shapes we see within the skulls of different animals, as well as the wonderful array of facial shapes seen in humans," said Barske, lead author and A.P. Giannini postdoctoral research fellow.

Scooped by Dr. Stefan Gruenwald!

The biggest Big Data project on Earth

The largest volume of data ever gathered and processed will pass through the UK, for scientists and SMBs to slice, dice, and turn into innovations and insights. When Big Data becomes Super-Massive Data.


Eventually there will be two SKA telescopes. The first, consisting of 130,000 2m dipole low-frequency antennae, is being built in the Shire of Murchison, a remote region about 800km north of Perth, Australia – an area the size of the Netherlands, but with a population of less than 100 people. Construction kicks off in 2018.


By Phase 2, said Diamond, the SKA will consist of half-a-million low and mid-frequency antennae, with arrays spread right across southern Africa as well as Australia, stretching all the way from South Africa to Ghana and Kenya – a multibillion-euro project on an engineering scale similar to the Large Hadron Collider. Which brings us to that supermassive data challenge for what, ultimately, will be an ICT-driven science facility. Diamond says: "The antennae will generate enormous volumes of data: even by the mid-2020s, Phase 1 of the project will be looking at 5,000 petabytes – five exabytes – a day of raw data. This will go to huge banks of digital signal processors, which we’re in the process of designing, and then into high-performance computers, and into an archive for scientists worldwide to access."


Our archive growth rate will be somewhere between 300 and 500 petabytes a year – science-quality data coming out of the supercomputer.


Using the most common element in the universe, neutral hydrogen, as a tracer, the SKA will be able to follow the trail all the way back to the cosmic dawn, a few hundred thousand years after the Big Bang. But over billions of years (a beam of light travelling at 671 million miles an hour would take 46.5 billion years to reach the edge of the observable universe) the wavelength of those ancient hydrogen signatures becomes stretched via the Doppler effect, until it falls into the same range as the radiation emitted by mobile phones, aircraft, FM radio, and digital TV. This is why the SKA arrays are being built in remote, sparsely populated regions, says Diamond:

"The aim is to get away from people. It’s not because we’re antisocial – although some of my colleagues probably are a little! – but we need to get away from radio interference, phones, microwaves, and so on, which are like shining a torch in the business end of an optical telescope."



Scooped by Dr. Stefan Gruenwald!

Einstein Ring: Dwarf Dark Galaxy Hidden in ALMA Gravitational Lens Image


Subtle distortions hidden in ALMA’s stunning image of the gravitational lens SDP.81 are telltale signs that a dwarf dark galaxy is lurking in the halo of a much larger galaxy nearly 4 billion light-years away. This discovery paves the way for ALMA to find many more such objects and could help astronomers address important questions on the nature of dark matter.

In 2014, as part of ALMA’s Long Baseline Campaign, astronomers studied a variety of astronomical objects to test the telescope's new, high-resolution capabilities. One of these experimental images was that of an Einstein ring, which was produced by the gravity of a massive foreground galaxy bending the light emitted by another galaxy nearly 12 billion light-years away.

This phenomenon, called gravitational lensing, was predicted by Einstein’s general theory of relativity and it offers a powerful tool for studying galaxies that are otherwise too distant to observe. It also sheds light on the properties of the nearby lensing galaxy because of the way its gravity distorts and focuses light from more distant objects.
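The strength of such a lens is commonly summarized by its Einstein radius, θ_E = √(4GM/c² · D_LS / (D_L·D_S)), which sets the angular scale of the ring. A rough sketch for an idealized point-mass lens (the mass and angular-diameter distances below are illustrative placeholders, not the measured SDP.81 values):

```python
import math

G     = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
C     = 2.998e8     # speed of light, m/s
M_SUN = 1.989e30    # solar mass, kg
MPC   = 3.086e22    # one megaparsec, m

def einstein_radius_arcsec(M, D_L, D_S, D_LS):
    """Einstein radius (arcseconds) of a point-mass lens of mass M (kg),
    given angular-diameter distances observer->lens D_L, observer->source
    D_S, and lens->source D_LS (all in metres)."""
    theta = math.sqrt(4 * G * M / C**2 * D_LS / (D_L * D_S))  # radians
    return theta * 180 / math.pi * 3600

# Illustrative numbers: a 1e11 solar-mass lens halfway to a distant source
# gives a ring a bit over half an arcsecond across in radius.
theta_E = einstein_radius_arcsec(1e11 * M_SUN, 1000 * MPC, 2000 * MPC, 1000 * MPC)
```

Substructure like a dwarf dark galaxy perturbs this smooth lens model at a much smaller angular scale, which is why detecting it required ALMA's resolution plus heavy computation.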

In a new paper accepted for publication in the Astrophysical Journal, astronomer Yashar Hezaveh at Stanford University in California and his team explain how detailed analysis of this widely publicized image uncovered signs of a hidden dwarf dark galaxy in the halo of the more nearby galaxy.

"We can find these invisible objects in the same way that you can see rain droplets on a window. You know they are there because they distort the image of the background objects,” explained Hezaveh. In the case of a rain drop, the image distortions are caused by refraction. In this image, similar distortions are generated by the gravitational influence of dark matter.

Current theories suggest that dark matter, which makes up about 80 percent of the mass of the Universe, is made of as-yet-unidentified particles that don’t interact with visible light or other forms of electromagnetic radiation. Dark matter does, however, have appreciable mass, so it can be identified by its gravitational influence.

For their analysis, the researchers harnessed thousands of computers working in parallel for many weeks, including the National Science Foundation's most powerful supercomputer, Blue Waters, to search for subtle anomalies that had a consistent and measurable counterpart in each "band" of radio data. From these combined computations, the researchers were able to piece together an unprecedented understanding of the lensing galaxy’s halo, the diffuse and predominantly star-free region around the galaxy, and discovered a distinctive clump less than one-thousandth the mass of the Milky Way.

Because of its relationship to the larger galaxy, estimated mass, and lack of an optical counterpart, the astronomers believe this gravitational anomaly may be caused by an extremely faint, dark-matter dominated satellite of the lensing galaxy. According to theoretical predictions, most galaxies should be brimming with similar dwarf galaxies and other companion objects. Detecting them, however, has proven challenging. Even around our own Milky Way, astronomers can identify only 40 or so of the thousands of satellite objects that are predicted to be present.

"This discrepancy between observed satellites and predicted abundances has been a major problem in cosmology for nearly two decades, even called a 'crisis' by some researchers," said Neal Dalal of the University of Illinois, a member of the team. "If these dwarf objects are dominated by dark matter, this could explain the discrepancy while offering new insights into the true nature of dark matter," he added.

Computer models of the evolution of the Universe indicate that by measuring the “clumpiness” of dark matter, it’s possible to measure its temperature. So by counting the number of small dark matter clumps around distant galaxies, astronomers can infer the temperature of dark matter, which has an important bearing on the smoothness of our Universe.

"If these halo objects are simply not there," notes co-author Daniel Marrone of the University of Arizona, "then our current dark matter model cannot be correct and we will have to modify what we think we understand about dark matter particles." 

Scooped by Dr. Stefan Gruenwald!

How alien can a planet be and still support life?

Geoscientists imagine the unearthly mechanisms that could keep alien planets habitable.


Just how fantastical a planet can be and still support recognizable life isn’t just a question for science fiction. Astronomers are searching the stars for otherworldly inhabitants, and they need a road map. Which planets are most likely to harbor life? That’s where geoscientists’ imaginations come in. Applying their knowledge of how our world works and what allows life to flourish, they are envisioning what kind of other planetary configurations could sustain thriving biospheres.


You don’t necessarily need an Earth-like planet to support Earth-like life, new research suggests. For decades, thinking about the best way to search for extraterrestrials has centered on a “Goldilocks” zone where temperatures are “just right” for liquid water, a key ingredient for life, to wet the surface of an Earth doppelgänger. But now it’s time to think outside the Goldilocks zone, some scientists say. Unearthly mechanisms could keep greenhouse gas levels in check and warm planets in the coldest outer reaches of a solar system. Life itself could even play a starring role in a planet’s enduring habitability.


“It’s an exciting time,” says Harvard planetary scientist Robin Wordsworth. “There’s still a ton for us to learn about the way different planets behave. The Goldilocks zone is just a very rough guide, and we need to keep an open mind.”

Scooped by Dr. Stefan Gruenwald!

Do 'genetic superheroes' exist? Or did media overhype Resilience Project?

Genetic Superheroes. Hitting the genetic lottery. 13 Incredibly lucky people. Bulletproof genomes.


That’s just a few of the ways people have described the results from a recent analysis of the genomes of over half a million people, which found that 13 lucky people have disease-causing mutations but don’t exhibit any symptoms.


The study is the largest effort to date to identify so-called ‘resilient’ individuals. These are healthy people who possess a mutation in their genome that is known to be disease-causing. Many believe the DNA of these resilient people holds the key to treating genetic diseases, like cystic fibrosis, that today are incurable.


The existence of these 13 genetic Herculeses has created much excitement in the media:

  • STAT: Genetic ‘unicorns’ defy their own DNA — and hint at treatments
  • NPR: How Do ‘Genetic Superheroes’ Overcome Their Bad DNA?
  • BBC: ‘Superhero DNA’ Keeps Diseases at Bay


But did the study really identify a few lucky winners of the genome lottery? What’s the real story here? The study, published in the journal Nature Biotechnology by a team of international scientists led by researchers at the Icahn School of Medicine at Mount Sinai in New York City, searched the genomes of 589,306 people—all over the age of 30—for 874 genes that are linked to 584 genetic diseases. All of these diseases begin to affect a person during childhood, like cystic fibrosis, Tay-Sachs and Pfeiffer syndrome.


The team obtained these sequences from a variety of previous studies, but most of the data—nearly 400,000 samples—came from the at-home, personal genetics test 23andMe. (On the 23andMe consent forms, customers can select a box to allow their DNA to be used in such research.) Pooling all of this data, the scientists identified 15,597 potentially resilient individuals, but after a rigorous screen of these candidates, they eliminated almost all of them, settling on just 13 that they believed were resilient.


The study is considered by the Resilience Project leaders to be a ‘proof of concept’, meaning they modestly set out to prove their methods could identify resilient individuals. The study’s leader, Stephen Friend, says the idea to look for resilient people came out of frustration from the lack of success he had in looking at the problem from the other way. He and other biotech researchers usually search for genetic variants common in a number of sick individuals and then look for ways to fix the defect, but Friend admits they have been largely unsuccessful using this approach. He hopes that by looking for resilient people instead, he will discover why they are resilient and then use that knowledge to treat those who do exhibit symptoms.


But some have begun to question the validity of the resilience of these candidates, which could blow a hole in the conclusions. The study was a retrospective analysis, meaning the authors looked over data from other studies to establish connections, but they did not personally examine any of the participants. More importantly, for many participants they never can. In several of the studies they borrowed data from, recontact was not even considered when asking for participant consent. For participants’ samples from 23andMe, the consent for recontact falls into a gray area because the company does not specifically ask for permission to recontact on its consent form.


Nature Biotechnology published an independent commentary from Daniel MacArthur, a geneticist who teaches at Harvard University and conducts research at Massachusetts General Hospital. MacArthur explains why this data collection flaw hurts the study’s validity: "Perhaps most unfortunately, the researchers could not recontact the majority of resilient individuals for further study because of a lack of necessary consent forms. This means that some of their resilient cases may be mirages (the result of undisclosed disease cases, sample swaps, or somatic mosaicism), and this lack of consent precluded the collection of further clinical and genetic data to explore possible resilience mechanisms."

Scooped by Dr. Stefan Gruenwald!

Small handheld device tracks disease mutations within minutes

QuantuMDx Group is one of the most exciting biotechs to emerge from the UK and is developing a low cost, simple-to-use, handheld laboratory for 15-minute diagnosis of disease at the patient's side, for commercialisation in 2015. The robust device, which reads and sequences DNA and converts it into binary code using a tiny computer chip, is ideally suited to help address the humanitarian health burden by offering molecular diagnostics at a fraction of the price of traditional testing.
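The company has not published how its chip actually encodes sequence reads, but "converting DNA into binary code" is most commonly done with two bits per base, as in this illustrative sketch (the encoding table and helper functions are hypothetical, not QuantuMDx's):

```python
# Two-bit packing of DNA bases: four symbols fit exactly in two bits.
ENCODE = {"A": 0b00, "C": 0b01, "G": 0b10, "T": 0b11}
DECODE = {v: k for k, v in ENCODE.items()}

def pack(seq):
    """Pack a DNA string into a single integer, two bits per base."""
    bits = 0
    for base in seq:
        bits = (bits << 2) | ENCODE[base]
    return bits

def unpack(bits, length):
    """Recover a DNA string of known length from its packed form."""
    bases = []
    for _ in range(length):
        bases.append(DECODE[bits & 0b11])  # read the lowest two bits
        bits >>= 2
    return "".join(reversed(bases))
```

Packing like this cuts storage to a quarter of one-byte-per-base text, which matters on a memory-constrained handheld device.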


Rapidly and accurately detecting and monitoring emerging drug resistance of infectious diseases such as malaria, TB and HIV will enable health professionals to immediately prescribe the most effective drug against that disease. Once the device has passed regulatory approval, it will be available in developed countries for infectious disease testing and rapid cancer profiling and, in time, be available over-the-counter at pharmacies.


Rescooped by Dr. Stefan Gruenwald from Geology!

Lightning is not evenly distributed around the world

A map of the world showing where lightning activity is most intense and where lightning rarely occurs.


The distribution of lightning on Earth is far from uniform. The ideal conditions for producing lightning and associated thunderstorms occur where warm, moist air rises and mixes with cold air above. These conditions occur almost daily in many parts of the Earth and rarely in other areas.

NASA has satellites orbiting the Earth with sensors designed to detect lightning. Data from these satellites is transmitted to Earth and used to construct a geographic record of lightning activity over time. The maps on this page are based upon the average yearly count of lightning flashes per unit of area. This data was plotted geographically to create the maps.

Much more lightning occurs over land than over the ocean because daily sunshine heats the land surface faster than the ocean. The heated land surface warms the air above it and that warm air rises to encounter cold air aloft. The interaction between air masses of different temperature stimulates thunderstorms and lightning.

Via Dr. Catherine Russell