Amazing Science
Rescooped by Dr. Stefan Gruenwald from GRAVITATION
onto Amazing Science!

'Standard Candle' Supernova Extraordinarily Magnified by Gravitational Lensing


A team of researchers at the Kavli IPMU led by Robert Quimby has identified what may prove to be the first ever Type Ia supernova (SNIa) magnified by a strong gravitational lens.


In this work, the 'standard candle' property of Type Ia supernovae is used to directly measure the magnification due to gravitational lensing. This provides the first glimpse of the science that will soon come out of dark matter and dark energy studies derived from deep, wide-field imaging surveys. The supernova, named PS1-10afx, was discovered by the Panoramic Survey Telescope and Rapid Response System 1 (Pan-STARRS1).


PS1-10afx exploded over 9 billion years ago, which places it far more distant than typical Pan-STARRS1 discoveries. Based on this distance and its relatively bright appearance, the Pan-STARRS1 team concluded that PS1-10afx was intrinsically very luminous.


The inferred luminosity, about 100 billion times greater than our Sun, is comparable to members of a new, rare variety of superluminous supernovae (SLSNe), but that is where the similarities end.


SLSNe typically have blue colors, and their brightness changes relatively slowly with time. PS1-10afx on the other hand was rather red even after correcting for its redshift, and its brightness changed as fast as normal supernovae. There is no known physical model that can explain how a supernova could simultaneously be so luminous, so red, and so fast.

Soon after the findings were announced, Robert Quimby, a postdoctoral researcher at Kavli IPMU, independently analyzed the data. Quimby is an expert in SLSNe and has played a key role in their discovery. He quickly confirmed part, but not all, of the conclusions.


PS1-10afx was indeed rather distinct from all known SLSNe, but the data struck Quimby as oddly familiar. He compared the features seen in the spectra of PS1-10afx to known supernovae and, surprisingly, found an excellent match: the spectra of PS1-10afx are almost identical to those of normal SNIa.


SNIa have a very useful property that has enabled cosmologists to chart the expansion of our Universe over the last several billion years: SNIa have strikingly similar peak luminosities that can be rendered even more standard by correcting for how quickly they brighten and fade (their "light curves").


This property allows astronomers to use SNIa as standard candles to measure distances, a technique that was key to the discovery of the accelerating expansion of the Universe (2011 Nobel Prize in Physics).
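The magnification measurement described above reduces to simple magnitude arithmetic: compare how bright the supernova should appear at its distance with how bright it actually appears. A minimal sketch (the magnitude values below are illustrative, not PS1-10afx's actual photometry):

```python
def lensing_magnification(m_expected: float, m_observed: float) -> float:
    """Flux magnification implied by a standard candle appearing brighter
    than predicted. Magnitudes are logarithmic: 5 mag = factor of 100 in flux."""
    delta_m = m_expected - m_observed
    return 10 ** (0.4 * delta_m)

# Illustrative numbers only: a supernova appearing 3.7 magnitudes brighter
# than the standard-candle prediction is magnified roughly 30-fold.
print(lensing_magnification(25.0, 21.3))
```

Because every normal SNIa has nearly the same corrected peak luminosity, any excess brightness left over after the light-curve correction can be attributed to the lens.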

Scooped by Dr. Stefan Gruenwald!

20,000+ FREE Online Science and Technology Lectures from Top Universities


NOTE: To subscribe to the RSS feed of Amazing Science, copy the feed link into the URL field of your browser and click "subscribe".


This newsletter is aggregated from over 1450 news sources:


All my Tweets and Scoop.It! posts sorted and searchable:



You can search through all the articles semantically on my archived twitter feed.


NOTE: All articles in the amazing-science newsletter can also be sorted by topic. To do so, click the FIND button (symbolized by the FUNNEL on the top right of the screen) to display all the relevant postings SORTED by TOPICS.


You can also type your own query, e.g., if you are looking for articles involving "dna" as a keyword. Or click on the little FUNNEL symbol at the top right of the screen.


MOST_READ • 3D_printing • aging • AI • anthropology • art • astronomy • bigdata • bioinformatics • biology • biotech • chemistry • computers • cosmology • education • environment • evolution • future • genetics • genomics • geosciences • green_energy • history • language • map • material_science • math • med • medicine • microscopy • nanotech • neuroscience • paleontology • photography • photonics • physics • postings • robotics • science • technology • video 

Margarida Sá Costa's curator insight, January 31, 9:55 AM

Lectures are in Playlists and are alphabetically sorted with thumbnail pictures. No fee, no registration required - learn at your own pace. Certificates can be arranged with presenting universities.

Casper Pieters's curator insight, March 9, 7:21 PM

Great resources for online learning just about everything. All you need is will power and self-discipline.

Siegfried Holle's curator insight, July 4, 8:45 AM

Your knowledge is your strength and power 


New device from Johns Hopkins yields close-up look at metastasizing cancer cells


Engineers at Johns Hopkins Institute for NanoBioTechnology (INBT) have invented a lab device that gives cancer researchers an unprecedented microscopic look at metastasis, the spread of tumor cells that causes more than 90 percent of cancer-related deaths. The device, described in a paper in the journal Cancer Research, was built with the goal of eventually stopping that spread.

“There’s still so much we don’t know about exactly how tumor cells migrate through the body, partly because, even using our best imaging technology, we haven’t been able to see precisely how these individual cells move into blood vessels,” said Andrew D. Wong, a Department of Materials Science and Engineering doctoral student and lead author of the journal article. “Our new tool gives us a clearer, close-up look at this process.”

The device replicates these migration processes in a small transparent chip that incorporates an artificial blood vessel and surrounding tissue material. A nutrient-rich solution flows through the artificial vessel, mimicking the properties of blood.

With this novel lab platform, Wong said, the team was able to record a video of the movement of individual cancer cells as they crawled through a three-dimensional collagen matrix. This material resembles the human tissue that surrounds tumors when cancer cells break away and try to relocate elsewhere in the body.

Wong also created a video (above) of single cancer cells prying and pushing their way through the wall of an artificial vessel lined with human endothelial cells, the same kind that line human blood vessels.

By entering the bloodstream through this process, called “intravasation,” cancer cells are able to hitch a ride to other parts of the body and begin to form deadly new tumors.

The breast cancer cells, inserted individually and in clusters in the tissue near the vessel, are labeled with fluorescent tags, enabling their behavior to be seen, tracked and recorded via a microscopic viewing system.


Fossilized Nuclei and Chromosomes Reveal 180 Million Years of Genomic Stasis in Royal Ferns


It defies belief, but a 180 million year old fern fossil unearthed in Sweden is so exquisitely preserved that it is possible to see its cells dividing. So pristine is the fossil, reported scientists from the Swedish Museum of Natural History in the journal Science in March, that it is possible for them to estimate its genome size from the size of its cell nuclei — and that it has remained substantially unchanged from its living descendants since the early Jurassic.

The ferns were swallowed by a volcanic mudflow called a lahar, in which gas and rocky debris from an eruption mix with water and sediment. After entombment, hot salty water percolated into the coarse sediments around the ferns and acted as a preservative brine that immortalized the hapless plants. Their misfortune was our luck: 180 million years later, we can see details of their macro and micro anatomy so well that we can see how uncannily similar they are to their living descendants, royal and cinnamon ferns. They could be sisters!

Fossils from the family this fern belongs to, the Osmundaceae (the royal ferns), had already been found in 220-million-year-old rocks and were recognizable as the living species Osmunda claytoniana, the interrupted fern; other Mesozoic fossils are virtually indistinguishable from other genera and species in the family. But microscopic preservation of this quality has rarely been seen in any fossil before.


Global groundwater crisis may get worse as the world warms


The world is facing an increasingly dire groundwater depletion crisis, a NASA researcher warns. From India to Texas, people are rapidly depleting their valuable stores of groundwater, raising the possibility that aquifers may be emptied within decades.

In a recent commentary in the journal Nature Climate Change, Jay Famiglietti, who has helped lead the use of a NASA satellite system to detect groundwater changes around the world, warned of dramatic consequences to come if changes are not made to the way that societies manage water supplies. “Our overuse of groundwater puts our overall water security at far greater risk than we thought,” Famiglietti says.

Unlike surface water, which is replenished through precipitation, groundwater can take centuries to recharge. Yet humans are depleting groundwater at rates that far exceed the pace at which this water can be replenished.

Think of it this way: groundwater is analogous to a pension, a long-term investment that takes many years to pay off. If you withdraw more than you put in, you'll go bankrupt in the long run. Dams and reservoirs, meanwhile, are more like a checking account.

"Groundwater is being pumped at far greater rates than it can be naturally replenished, so that many of the largest aquifers on most continents are being mined, their precious contents never to be returned," Famiglietti, a researcher at NASA's Jet Propulsion Laboratory in California, wrote.

Famiglietti has used NASA’s Gravity Recovery and Climate Experiment (GRACE) satellite system, which is capable of detecting the most subtle changes in Earth's gravitational field to spot land elevation changes, and thus water depletion, to publish a number of studies on groundwater in recent years. During the summer, for example, he contributed to a study that revealed that water users throughout the Colorado River Basin are tapping into groundwater supplies to make up for the lack of adequate supplies of surface water.

The study found that more than 75% of the water loss in the Colorado River Basin since 2004 came from groundwater. GRACE showed that between December 2004 and November 2013, the basin lost nearly 53 million acre-feet of freshwater, double the total volume of the country’s largest reservoir, Lake Mead on the Arizona–Nevada border. More than three-quarters of the total, about 41 million acre-feet, was from groundwater.
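The figures above can be checked with a quick unit conversion. A short sketch (the Lake Mead capacity used here, about 26 million acre-feet, is an assumed reference figure, not from the article):

```python
ACRE_FOOT_M3 = 1233.48  # one acre-foot in cubic meters

def acre_feet_to_km3(acre_feet: float) -> float:
    """Convert a volume in acre-feet to cubic kilometers."""
    return acre_feet * ACRE_FOOT_M3 / 1e9

basin_loss = acre_feet_to_km3(53e6)        # total basin loss, ~65 km^3
groundwater_loss = acre_feet_to_km3(41e6)  # groundwater share, ~51 km^3
lake_mead = acre_feet_to_km3(26e6)         # assumed Lake Mead capacity, ~32 km^3

print(basin_loss / lake_mead)         # roughly double Lake Mead
print(groundwater_loss / basin_loss)  # more than three-quarters of the loss
```

The conversion confirms the article's internal arithmetic: 53 million acre-feet is indeed about twice the assumed reservoir volume, and 41 of those 53 million acre-feet is just over 77%.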

Katelyn Sesny's curator insight, October 31, 11:41 AM

A lengthy but interesting article. The issue of the "Global Groundwater Crisis" might become a very large problem in the near future. - UNIT 1


Data smashing: Uncovering lurking order in underlying data


From recognizing speech to identifying unusual stars, new discoveries often begin with comparison of data streams to find connections and spot outliers. But simply feeding raw data into a data-analysis algorithm is unlikely to produce meaningful results, say the authors of a new Cornell study. That’s because most data comparison algorithms today have one major weakness: somewhere, they rely on a human expert to specify what aspects of the data are relevant for comparison, and what aspects aren’t.

But these experts can’t keep up with the growing amounts and complexities of big data. So the Cornell computing researchers have come up with a new principle they call “data smashing” for estimating the similarities between streams of arbitrary data without human intervention, and even without access to the data sources.

Data smashing is based on a new way to compare data streams. The process involves two steps.

  1. The data streams are algorithmically “smashed” to “annihilate” the information in each other.
  2. The process measures what information remains after the collision. The more information remains, the less likely the streams originated in the same source.
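The two steps above rely on the authors' model-based "annihilation" procedure. As a loose analogy only (this is normalized compression distance, a different but similarly model-free technique, not the published data-smashing algorithm), a similarity score between raw byte streams can be computed with no domain knowledge at all:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: near 0 for streams sharing
    structure, near 1 for unrelated streams. Model-free, like data
    smashing in spirit, though not the same algorithm."""
    cx = len(zlib.compress(x))
    cy = len(zlib.compress(y))
    cxy = len(zlib.compress(x + y))
    return (cxy - min(cx, cy)) / max(cx, cy)

periodic = b"ab" * 300          # highly regular stream
varied = bytes(range(256)) * 3  # structurally different stream

print(ncd(periodic, periodic))  # low: the streams "cancel" each other
print(ncd(periodic, varied))    # high: information remains after the comparison
```

The parallel to step 2 is that the score measures how much information is left over after the two streams are combined: identical sources leave little, unrelated sources leave a lot.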

Data-smashing principles could open the door to understanding increasingly complex observations, especially when experts don’t know what to look for, according to the researchers. The researchers — Hod Lipson, associate professor of mechanical engineering and of computing and information science, and Ishanu Chattopadhyay, a former postdoctoral associate with Lipson now at the University of Chicago — demonstrated this idea with data from real-world problems, including detection of anomalous cardiac activity from heart recordings and classification of astronomical objects from raw photometry.

In all cases and without access to original domain knowledge, the researchers demonstrated that the performance of these general algorithms was on par with the accuracy of specialized algorithms and heuristics tweaked by experts to work.


World's first photonic pressure sensor outshines traditional mercury standard


For almost 400 years, mercury gauges have prevailed as the most accurate way to measure pressure. Now, within weeks of seeing "first light," a novel pressure-sensing device has surpassed the performance of the best mercury-based techniques in resolution, speed, and range at a fraction of the size. The new instrument, called a fixed-length optical cavity (FLOC), works by detecting subtle changes in the wavelength of light passing through a cavity filled with nitrogen gas.

The FLOC system is poised to depose traditional mercury pressure sensors – also called manometers – as the standard used to calibrate commercial equipment, says the interdisciplinary team of NIST researchers who developed the system and will continue to refine it over the next few years. The new design is also a promising candidate for a factory-floor pressure instrument that could be used by a range of industries, including those associated with semiconductor, glass, and aerospace manufacturing.

"We've exceeded the expectations we had three years ago," says Thermodynamic Metrology Group Leader Gregory Strouse. "This device is not only a photonic sensor, it's also a primary standard. It's the first photonic-based primary pressure standard. And it works."

About the size of a travel mug, the FLOC has a resolution of 0.1 mPa (millipascal, or thousandths of a pascal), 36 times better than NIST's official U.S. pressure standard, which is a 3-meter-tall (about 10-foot) column of liquid mercury that extends through the ceiling of the calibration room.
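For scale, the pressure a mercury column supports follows the hydrostatic relation P = ρgh, which is why the standard needs a 3-meter column to span roughly four atmospheres. A quick sketch using standard handbook constants (these are generic reference values, not NIST's calibration figures):

```python
RHO_MERCURY = 13595.1  # density of mercury at 0 deg C, kg/m^3
G_STANDARD = 9.80665   # standard gravity, m/s^2

def mercury_column_pressure(height_m: float) -> float:
    """Pressure supported by a mercury column of given height, P = rho*g*h (Pa)."""
    return RHO_MERCURY * G_STANDARD * height_m

full_scale = mercury_column_pressure(3.0)  # ~400 kPa for the 3 m column

# The FLOC resolves 0.1 mPa; 36x coarser implies the mercury standard
# resolves about 3.6 mPa.
floc_resolution_pa = 0.1e-3
mercury_resolution_pa = 36 * floc_resolution_pa
print(full_scale, mercury_resolution_pa)
```

Seen this way, the FLOC's 0.1 mPa resolution amounts to detecting roughly one part in four billion of the mercury column's full-scale pressure.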


Algal virus found in human throats, affects brain function


It’s not such a stretch to think that humans can catch the Ebola virus from monkeys and the flu virus from pigs. After all, they are all mammals with fundamentally similar physiologies. But now researchers have discovered that even a virus found in the lowly algae can make mammals its home. The invader doesn’t make people or mice sick, but it does seem to slow specific brain activities.

The virus, called ATCV-1, showed up in human brain tissue several years ago, but at the time researchers could not be sure whether it had entered the tissue before or after the people died. Then, it showed up again in a survey of microbes and viruses in the throats of people with psychiatric disease. Pediatric infectious disease expert Robert Yolken from Johns Hopkins University School of Medicine in Baltimore, Maryland, and his colleagues were trying to see if pathogens play a role in these conditions. At first, they didn't know what ATCV-1 was, but a database search revealed its identity as a virus that typically infects a species of green algae found in lakes and rivers.

The researchers wanted to find out if the virus was in healthy people as well as sick people. They checked for it in 92 healthy people participating in a study of cognitive function and found it in 43% of them. What’s more, those infected with the virus performed 10% worse than uninfected people on tests requiring visual processing. They were slower in drawing a line connecting a sequence of numbers randomly placed on a page, for example. And they seemed to have shorter attention spans, the researchers report online today in the Proceedings of the National Academy of Sciences. The effects were modest, but significant.

The slower brain function was not associated with any differences in sex, income or education level, race, place of birth, or cigarette smoking. But that doesn't necessarily mean the virus causes cognitive decline; it might just benefit from some other factor that impairs the brain in some people, such as other infectious agents, heavy metals, or pollutants, the researchers say.

To test for causality, the team injected uninfected and infected green algae into the mouths of mice. (They could tell that the mice became infected with the virus because they developed antibodies to it.) Infected and uninfected mice underwent a battery of tests. The two groups were about on par with how well they moved, but infected animals took 10% longer to find their way out of mazes and spent 20% less time exploring new objects—indications that they had poorer attention spans and were not as good at remembering their surroundings.


What hides beneath: Mapping long lost megalopolis underneath tropical rainforest


Archaeologists are known for getting their hands—and everything else—dirty. However, in April 2012, archaeologist Damian Evans boarded a helicopter and spent hours being flown over the dense foliage surrounding Angkor Wat, Cambodia’s legendary complex of ancient stone temples. The heat inside the helicopter was intense, reaching upward of 40 °C, but Evans, a faculty member of the University of Sydney based in Cambodia, much preferred flying over the trees than trudging through the vegetation beneath them. The visible, known temples in Angkor were well-trodden and populated with visitors from all over the world. However, the outlying forest beneath Evans’s ride, lush and green from the air, hid land mines left over from Cambodia’s tumultuous past.

In recent years, archaeologists have used this technology, called LiDAR (a portmanteau of “light” and “radar”), to find ruins of structures—roads, canals, temples, reservoirs—long overgrown with vegetation and hidden from easy observation. LiDAR is revolutionizing what scientists think about the size of ancient cities and how ancient civilizations used the land. It has accelerated the pace of surveying nearly impenetrable areas to a rate that would have been unthinkable just a few years ago.

After 20 hours of flying over two days, the helicopter carrying Evans had surveyed 370 square kilometers (91,400 acres) and Evans had as much data about Angkor’s hidden landforms as he might have gathered during his entire career.

“To achieve the same number of data points as we did with LiDAR would have taken decades on the ground,” Evans says. In addition, Evans says he and his colleagues suspected that previous studies of the area were incomplete. “Our concerns were that previous research had missed three-quarters of the downtown metropolitan part of Angkor.”

The urban sprawl around the temples of Angkor had already been identified as the largest integrated settlement complex of the preindustrial world. However, Evans’s LiDAR map confirmed that existing surveys had been vastly underestimating the size of the formally planned street grid in the central area of the city.

In a 2007 PNAS study (1) that combined ground surveys with airborne radar mapping, Evans and his colleagues first found a chaotic urban sprawl beyond the walls of the Angkor Wat complex, with the temples at the center of large urban landscapes.

In a study based on LiDAR maps, published in PNAS in July 2013 (2), Evans et al. reported that the lasers illuminated in exceptional detail that the now-tangled land outside the temple walls had once, 1,000 years ago, been divided into neat rectangles like city blocks, with canals, ponds, and residences. The urban grids found inside the walls had been built outside, too, covering an area of 36 square kilometers.

“It’s relatively easy to draw a line around temples, but the revelation from LiDAR is that you find this web of subtle traces of urban networks,” Evans says.

The new study suggests the Angkor capitals were much more densely populated than was previously believed, offering more evidence for a growing idea that the Khmer civilization grew so large that it couldn’t grow enough crops to keep up. Overpopulation, combined with the lack of sustainable agricultural methods, may have left the cities vulnerable to decades-long droughts that struck in the 14th and 15th centuries, the same time the Khmer kings abandoned Angkor.


Turning loss into gain: Cutting power could dramatically boost laser output

Lasers – devices that deliver beams of highly organized light – are so deeply integrated into modern technology that their basic operations would seem well understood. CD players, medical diagnostics and military surveillance all depend on lasers.

Re-examining longstanding beliefs about the physics of these devices, Princeton engineers have now shown that carefully restricting the delivery of power to certain areas within a laser could boost its output by many orders of magnitude. The finding, published Oct. 26 in the journal Nature Photonics, could allow far more sensitive and energy-efficient lasers, as well as potentially more control over the frequencies and spatial pattern of light emission.

"It's as though you are using loss to your advantage," said graduate student Omer Malik, an author of the study along with Li Ge, now an assistant professor at the City University of New York, and Hakan Tureci, assistant professor of electrical engineering at Princeton. The researchers said that restricting the delivery of power causes much of the physical space within a laser to absorb rather than produce light. In exchange, however, the optimally efficient portion of the laser is freed from competition with less efficient portions and shines forth far more brightly than previous estimates had suggested.

The results, based on mathematical calculations and computer simulations, still need to be verified in experiments with actual lasers, but the researchers said it represents a new understanding of the fundamental processes that govern how lasers produce light.

"Distributing gain and loss within the material is a higher level of design – a new tool – that had not been used very systematically until now," Tureci said.

The heart of a laser is a material that emits light when energy is supplied to it. When a low level of energy is added, the light is "incoherent," essentially meaning that it contains a mix of wavelengths (or colors). As more energy is added, the material suddenly reaches a "lasing" threshold when it emits coherent light of a particular wavelength.

The entire surface of the material does not emit laser light; rather, if the material is arranged as a disc, for example, the light might come from a ring close to the edge. As even more energy is added, more patterns emerge – for example a ring closer to the center might reach the laser threshold. These patterns – called modes – begin to interact and sap energy from each other. Because of this competition, subsequent modes requiring higher energy may never reach their lasing thresholds. However, Tureci's research group found that some of these higher threshold modes were potentially far more efficient than the earlier ones if they could just be allowed to function without competition.


The top 100 papers: NATURE magazine explores the most-cited research papers of all time


The discovery of high-temperature superconductors, the determination of DNA’s double-helix structure, the first observations that the expansion of the Universe is accelerating — all of these breakthroughs won Nobel prizes and international acclaim. Yet none of the papers that announced them comes anywhere close to ranking among the 100 most highly cited papers of all time.

Citations, in which one paper refers to earlier works, are the standard means by which authors acknowledge the source of their methods, ideas and findings, and are often used as a rough measure of a paper’s importance. Fifty years ago, Eugene Garfield published the Science Citation Index (SCI), the first systematic effort to track citations in the scientific literature. To mark the anniversary, Nature asked Thomson Reuters, which now owns the SCI, to list the 100 most highly cited papers of all time. (See the full list at Web of Science Top 100.xls or the interactive graphic, below.) The search covered all of Thomson Reuters’ Web of Science, an online version of the SCI that also includes databases covering the social sciences, arts and humanities, conference proceedings and some books. It lists papers published from 1900 to the present day.

The exercise revealed some surprises, not least that it takes a staggering 12,119 citations to rank in the top 100 — and that many of the world’s most famous papers do not make the cut. A few that do, such as the first observation [1] of carbon nanotubes (number 36), are indeed classic discoveries. But the vast majority describe experimental methods or software that have become essential in their fields.

The most cited work in history, for example, is a 1951 paper [2] describing an assay to determine the amount of protein in a solution. It has now gathered more than 305,000 citations — a recognition that always puzzled its lead author, the late US biochemist Oliver Lowry.


DARPA amplifier circuit achieves speeds of 1 trillion Hz, enters Guinness World Records


Officials from Guinness World Records have recognized DARPA’s Terahertz Electronics program for creating the fastest solid-state amplifier integrated circuit ever measured: one terahertz (10^12 Hz), or one trillion cycles per second — 150 billion cycles per second faster than the existing world record set in 2012.

“This breakthrough could lead to revolutionary technologies such as high-resolution security imaging systems, improved collision-avoidance radar, communications networks with many times the capacity of current systems, and spectrometers that could detect potentially dangerous chemicals and explosives with much greater sensitivity,” said Dev Palmer, DARPA program manager.

Developed by Northrop Grumman Corporation, the Terahertz Monolithic Integrated Circuit (TMIC) exhibits power gains (amplification) several orders of magnitude beyond the current state of the art by using a super-scaled 25 nanometer gate-length indium phosphide high electron mobility transistor.

The TMIC showed a measured gain (on the logarithmic scale) of nine decibels at 1.0 terahertz and eight decibels at 1.03 terahertz. “Nine decibels of gain is unheard of at terahertz frequencies,” said Palmer. “This opens up new possibilities for building terahertz radio circuits.”
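Decibels are logarithmic, so the gain figures above translate into linear power ratios with a standard conversion (this is the generic dB formula, not DARPA's measurement procedure):

```python
def db_to_power_ratio(gain_db: float) -> float:
    """Convert a power gain in decibels to a linear power ratio,
    ratio = 10^(dB / 10)."""
    return 10 ** (gain_db / 10)

# 9 dB at 1.0 THz is roughly an 8x power gain;
# 8 dB at 1.03 THz is roughly 6.3x.
print(db_to_power_ratio(9.0))
print(db_to_power_ratio(8.0))
```

In other words, the circuit delivers nearly eight times more power out than in while oscillating a trillion times per second.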

By contrast, current smartphone technology operates at one to two gigahertz and wireless networks at 5.7 gigahertz.

For years, researchers have been looking to exploit the high-frequency sub-millimeter-wave spectrum beginning above 300 gigahertz. Current electronics using solid-state technologies have largely been unable to access the sub-millimeter band of the electromagnetic spectrum due to insufficient transistor performance.

To address the “terahertz gap,” engineers have traditionally used frequency conversion—converting alternating current at one frequency to alternating current at another frequency—to multiply circuit operating frequencies up from millimeter-wave frequencies.

This approach, however, restricts the output power of electrical devices and adversely affects signal-to-noise ratio. Frequency conversion also increases device size, weight and power supply requirements.


Making designer mutants in all kinds of model organisms


Recent advances in the targeted modification of complex eukaryotic genomes have unlocked a new era of genome engineering. From the pioneering work using zinc-finger nucleases (ZFNs), to the advent of the versatile and specific TALEN systems, and most recently the highly accessible CRISPR/Cas9 systems, we now possess an unprecedented ability to analyze developmental processes using sophisticated designer genetic tools. Excitingly, these robust and simple genomic engineering tools also promise to revolutionize developmental studies using less well established experimental organisms.

Modern developmental biology was born out of the fruitful marriage between traditional embryology and genetics. Genetic tools, together with advanced microscopy techniques, serve as the most fundamental means for developmental biologists to elucidate the logistics and the molecular control of growth, differentiation and morphogenesis. For this reason, model organisms with sophisticated and comprehensive genetic tools have been highly favored for developmental studies. Advances made in developmental biology using these genetically amenable models have been well recognized. The Nobel Prize in Physiology or Medicine was awarded in 1995 to Edward B. Lewis, Christiane Nüsslein-Volhard and Eric F. Wieschaus for their discoveries on the ‘Genetic control of early structural development’ using Drosophila melanogaster, and again in 2002 to John Sulston, Robert Horvitz and Sydney Brenner for their discoveries of ‘Genetic regulation of development and programmed cell death’ using the nematode worm Caenorhabditis elegans.

These fly and worm systems remain powerful and popular models for invertebrate development studies, while zebrafish (Danio rerio), the two frog species Xenopus laevis and Xenopus tropicalis, rat (Rattus norvegicus), and particularly mouse (Mus musculus) represent the most commonly used vertebrate model systems. To date, random or semi-random mutagenesis (‘forward genetic’) approaches have been extraordinarily successful at advancing the use of these model organisms in developmental studies. With the advent of reference genomic data, however, sequence-specific genomic engineering tools (‘reverse genetics’) enable targeted manipulation of the genome and thus allow previously untestable hypotheses of gene function to be addressed.


Sequenced genomes reveal mutations that disable single genes and can help to identify new drugs

On average, every person carries mutations that inactivate at least one copy of 200 or so genes and both copies of around 20 genes. However, knockout mutations in any particular gene are rare, so very large populations are needed to study their effects. These ‘loss of function’ mutations have long been implicated in certain debilitating diseases, such as cystic fibrosis. Most, however, seem to be harmless — and some are even beneficial to the people carrying them. “These are people we’re not going to find in a clinic, but they’re still really informative in biology,” says Daniel MacArthur, a geneticist at Massachusetts General Hospital in Boston.
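The scale problem is easy to quantify: under Hardy-Weinberg proportions, an allele at frequency q is present in both copies at frequency q², so the cohort needed to find complete knockouts grows with the square of the allele's rarity. A back-of-the-envelope sketch (the allele frequencies below are illustrative, not figures from the study):

```python
# Under Hardy-Weinberg proportions, a loss-of-function allele at
# frequency q produces complete ("two-copy") knockouts at frequency q^2,
# so the cohort needed to observe even a few of them grows as 1/q^2.
def cohort_for_knockouts(q, expected=10):
    """Sample size at which `expected` homozygous knockouts are expected."""
    return expected / q**2

for q in (0.01, 0.001):
    print(f"allele frequency {q:>6}: ~{cohort_for_knockouts(q):,.0f} people "
          f"for 10 expected knockouts")
```

For an allele carried by 1 in 1,000 people, that is on the order of ten million participants, which is why studies of this kind lean on national biobanks rather than clinic populations.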

His group and others had been focusing on genome data, but they are now also starting to mine patient-health records to determine the — sometimes subtle — effects of the mutations. In a study of more than 36,000 Finnish people, published in July (E. T. Lim et al. PLoS Genet. 10, e1004494; 2014), MacArthur and his team discovered that people lacking a gene called LPA might be protected from heart disease, and that another knockout mutation, carried in one copy of a gene by up to 2.4% of Finns, may cause fetuses to miscarry if it is present in both copies.

Bing Yu of the University of Texas Health Science Center in Houston told the meeting how he and his collaborators had compared knockout mutations found in more than 1,300 people with measurements of around 300 molecules in their blood. The team found that mutations in one gene, called SLCO1B1, were linked to high levels of fatty acids, a known risk factor for heart failure. And a team from the Wellcome Trust Sanger Institute in Hinxton, UK, reported that 43 genes whose inactivation is lethal to mice were found to be inactivated in humans who are alive and apparently well.

The poster child for human-knockout efforts is a new class of drugs that block a gene known as PCSK9 (see Nature 496, 152–155; 2013). The gene was discovered in French families with extremely high cholesterol levels in the early 2000s. But researchers soon found that people with rare mutations that inactivate one copy of PCSK9 have low cholesterol and rarely develop heart disease. The first PCSK9-blocking drugs should hit pharmacies next year, with manufacturers jostling for a share of a market that could reach US$25 billion in five years.

“I think there are hundreds more stories like PCSK9 out there, maybe even thousands,” in which a drug can mimic an advantageous loss-of-function mutation, says Eric Topol, director of the Scripps Translational Science Institute in La Jolla, California. Mark Gerstein, a bioinformatician at Yale University in New Haven, Connecticut, predicts that human knockouts will be especially useful for identifying drugs that treat diseases of ageing. “You could imagine there’s a gene that is beneficial to you as a 25-year-old, but the thing is not doing a good job for you when you’re 75.”

Scooped by Dr. Stefan Gruenwald!

Oxygen levels on ancient Earth were key to animal evolution


Atmospheric oxygen levels during the billion years or so prior to the rise of animals were far too low for complex life forms to develop, according to a new study. The findings, reported in the journal Science, imply that the appearance of diverse animal life on Earth about 800 million years ago was triggered by increases in oxygen levels - and not just genetic innovations in individual organisms.

"No one really doubted that oxygen levels were low, but how low is the real surprise," says one of the study's authors, Dr Peter McGoldrick of the University of Tasmania. "Our work shows those levels were just 0.1 per cent of present atmospheric levels, which is significant from an evolutionary point of view because biologists believe that complex multicellular life forms require much more oxygen than 0.1 per cent."

This is the first time anyone has been able to quantify the levels of oxygen in the atmosphere during the mid-Proterozoic period between 0.8 and 1.8 billion years ago, he says.

McGoldrick describes this period in Earth's history as the 'boring billion', when life remained largely constant and unchanging between the appearance of complex cells around 2 billion years ago, and the sudden diversification of multicellular animals about 800 million years ago.

Scientists already knew that oxygen began to accumulate in the atmosphere after cyanobacteria began using photosynthesis to produce oxygen over three billion years ago. So they wondered why animal species didn't flourish during the boring billion year stretch leading up to the end of the Proterozoic, when most researchers thought there was plenty of oxygen.

"We knew oxygen levels had gone up overall, but we didn't know if it had gone up to 1, 10 or 40 per cent of present atmospheric levels," says McGoldrick. "This explains why complex animals don't appear in the rock record until maybe 750 to 800 million years ago: there simply wasn't enough oxygen for the metabolic things they need to do."

Oxygen levels in the atmosphere were determined by examining chromium isotope ratios in ironstone samples. This provided information on oxygen levels for the billion or so years leading up to the 'Cambrian explosion' - when most major animal groups appeared on the planet.

Scooped by Dr. Stefan Gruenwald!

New brain decoder algorithm can eavesdrop on your inner voice


As you read this, your neurons are firing – that brain activity can now be decoded to reveal the silent words in your head. Talking to yourself used to be a strictly private pastime. That's no longer the case – researchers have eavesdropped on our internal monologue for the first time. The achievement is a step towards helping people who cannot physically speak communicate with the outside world.

"If you're reading text in a newspaper or a book, you hear a voice in your own head," says Brian Pasley at the University of California, Berkeley. "We're trying to decode the brain activity related to that voice to create a medical prosthesis that can allow someone who is paralysed or locked in to speak."

When you hear someone speak, sound waves activate sensory neurons in your inner ear. These neurons pass information to areas of the brain where different aspects of the sound are extracted and interpreted as words.

In a previous study, Pasley and his colleagues recorded brain activity in people who already had electrodes implanted in their brains to treat epilepsy, while they listened to speech. The team found that certain neurons in the brain's temporal lobe were only active in response to certain aspects of sound, such as a specific frequency. One set of neurons might only react to sound waves that had a frequency of 1000 hertz, for example, while another set only to those at 2000 hertz. Armed with this knowledge, the team built an algorithm that could decode the words heard based on neural activity alone (PLoS Biology).
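In spirit, such a decoder inverts the tuning map: if each electrode's response is approximately a weighted sum of the sound's frequency bands, a least-squares fit from responses back to the spectrogram can reconstruct what was heard. A toy sketch on synthetic data (not the team's actual model or code):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for the recordings: each "electrode" mixes the
# sound's frequency bands with its own (hidden) tuning weights. We fit
# a least-squares decoder from neural responses back to the spectrogram,
# then reconstruct a held-out spectrogram from responses alone.
n_bands, n_electrodes, n_train = 8, 32, 500

tuning = rng.random((n_electrodes, n_bands))       # hidden frequency tuning

spec_train = rng.random((n_train, n_bands))        # training spectrogram frames
resp_train = spec_train @ tuning.T + 0.05 * rng.standard_normal((n_train, n_electrodes))

# Decoder W solves resp_train @ W ~= spec_train in the least-squares sense.
W, *_ = np.linalg.lstsq(resp_train, spec_train, rcond=None)

spec_test = rng.random((100, n_bands))             # "heard" sound (held out)
spec_hat = (spec_test @ tuning.T) @ W              # reconstruction from responses

corr = np.corrcoef(spec_hat.ravel(), spec_test.ravel())[0, 1]
print(f"reconstruction correlation: {corr:.2f}")
```

The real system works on noisy intracranial recordings and far richer acoustic features, but the core idea is the same: learn the mapping from neural activity to sound while the subject listens, then run it in reverse.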

Scooped by Dr. Stefan Gruenwald!

Genetic factors behind surviving or dying from Ebola shown in mouse study


A newly developed mouse model suggests that genetic factors are behind the mild-to-deadly range of responses to the Ebola virus. The frequency of different manifestations of the disease across the lines of these mice is similar in variety and proportion to the spectrum of clinical disease observed in the 2014 West African outbreak. The new mouse model might be useful in testing candidate therapeutics and vaccines for Ebola, and in finding genetic markers for susceptibility and resistance to the disease.

Research on Ebola prevention and treatment has been hindered by the lack of a mouse model that replicates the main characteristics of human Ebola hemorrhagic fever. The researchers had originally obtained this genetically diverse group of inbred laboratory mice to study locations on mouse genomes associated with influenza severity.

The research was conducted in a highly secure, state-of-the-art biocontainment safety level 4 laboratory in Hamilton, Mont. The scientists examined mice that they infected with a mouse form of the same species of Ebola virus causing the 2014 West Africa outbreak. The study was done in full compliance with federal, state, and local safety and biosecurity regulations. This type of virus has been used several times before in research studies. Nothing was done to change the virus.

Interestingly, conventional laboratory mice previously infected with this virus died, but did not develop symptoms of Ebola hemorrhagic fever.

In the present study, all the mice lost weight in the first few days after infection. Nineteen percent of the mice were unfazed. They not only survived, but also fully regained their lost weight within two weeks. They had no gross pathological evidence of disease. Their livers looked normal. Eleven percent were partially resistant and less than half of these died. Seventy percent of the mice had a greater than 50 percent mortality. Nineteen percent of this last group had liver inflammation without classic symptoms of Ebola, and thirty-four percent had blood that took too long to clot, a hallmark of fatal Ebola hemorrhagic fever in humans. Those mice also had internal bleeding, swollen spleens and changes in liver color and texture.

The scientists correlated disease outcomes and variations in mortality rates to specific genetic lines of mice.

Scooped by Dr. Stefan Gruenwald!

See me here, see me there: A quantum world arising from many ordinary ones


The bizarre behavior of the quantum world — with objects existing in two places simultaneously and light behaving as either waves or particles — could result from interactions between many 'parallel' everyday worlds, a new theory suggests.

“It is a fundamental shift from previous quantum interpretations,” says Howard Wiseman, a theoretical quantum physicist at Griffith University in Brisbane, Australia, who together with his colleagues describes the idea in Physical Review X.

Theorists have tried to explain quantum behavior through various mathematical frameworks. One of the older interpretations envisages the classical world as stemming from the existence of many simultaneous quantum ones. But that ‘many worlds’ approach, pioneered by the US theorist Hugh Everett III in the 1950s, relies on the worlds branching out independently from one another, and not interacting at all (see 'Many worlds: See me here, see me there').

By contrast, Wiseman’s team envisages many worlds bumping into one another, calling it the 'many interacting worlds' approach. On its own, each world is ruled by classical Newtonian physics. But together, the interacting motion of these worlds gives rise to phenomena that physicists typically ascribe to the quantum world.

The authors work through the mathematics of how that interaction could produce quantum phenomena. For instance, one well-known example of quantum behavior is when particles are able to tunnel through an energetic barrier that in a classical world they would not be able to overcome on their own. Wiseman says that, in his scenario, as two classical worlds approach an energetic barrier from either side, one of them will increase in speed while the other will bounce back. The leading world will thus pop through the seemingly insurmountable barrier, just as particles do in quantum tunneling.
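That energy hand-off is easy to caricature numerically. In the one-dimensional sketch below (barrier shape, masses and the inter-world force are all assumed for illustration; this is not the paper's formulation), two classical worlds each carry less kinetic energy than the barrier height, but a mutual repulsion lets the leading world take energy from the trailing one and pass over the barrier while the trailing world is pushed back:

```python
import numpy as np

# Two classical "worlds" in 1D. Each has kinetic energy 0.845, below the
# Gaussian barrier height V0 = 1.0, so alone neither could cross. An
# assumed inter-world repulsion U(d) = k/d lets the leading world take
# energy from the trailing one.
V0, sigma, k, m, dt = 1.0, 0.5, 0.1, 1.0, 1e-3

def barrier_V(x):
    return V0 * np.exp(-x**2 / (2 * sigma**2))

def forces(x):
    d = x[1] - x[0]
    f_barrier = barrier_V(x) * x / sigma**2   # -dV/dx of the Gaussian barrier
    f_int = k / d**2                          # repulsion pushes the worlds apart
    return f_barrier + np.array([-f_int, f_int])

def total_energy(x, v):
    return 0.5 * m * np.sum(v**2) + np.sum(barrier_V(x)) + k / (x[1] - x[0])

x = np.array([-4.0, -3.6])   # trailing world, leading world
v = np.array([1.3, 1.3])     # 0.5*m*v^2 = 0.845 < V0 for each world alone

E0 = total_energy(x, v)
f = forces(x)
for _ in range(12000):       # velocity-Verlet integration to t = 12
    v += 0.5 * dt * f / m
    x += dt * v
    f = forces(x)
    v += 0.5 * dt * f / m

print(f"energy drift: {abs(total_energy(x, v) - E0):.1e}")
print(f"leading world crossed: {x[1] > 0}; trailing world reflected: {x[0] < 0}")
```

Total energy is conserved; it is only redistributed between the worlds, so the leading one ends up over the barrier and the trailing one bounces back, echoing the tunnelling picture described above.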

But much work remains. “By no means have we answered all the questions that such a shift entails,” says Wiseman. Among other things, he and his collaborators have yet to overcome challenges such as explaining how their many-interacting-worlds theory could explain quantum entanglement, a phenomenon in which particles separated by a distance are still linked in terms of their properties.

Carlos Garcia Pando's comment, October 31, 5:25 AM
I think entanglement is a consequence of two simple universes perfectly matching in one particle. What we see is not two entangled particles but one particle that belongs to two very close universes. Close in a different sense, not spatial proximity as we know it, but close enough to share at least one particle in all its observable attributes but space position.
Vloasis's curator insight, October 31, 2:56 PM

Much to ponder.

Scooped by Dr. Stefan Gruenwald!

IBM Watson uses cognitive technologies to help finding new sources of oil


Scientists at IBM and Repsol SA, Spain's largest energy company, announced today (Oct. 30) the world's first research collaboration using cognitive technologies like IBM's Watson to jointly develop and apply new tools to make it cheaper and easier to find new oil fields.

An engineer will typically have to manually read through an enormous set of journal papers and baseline reports with models of reservoir, well, facilities, production, export, and seismic imaging data.

IBM says its cognitive technologies could help by analyzing hundreds of thousands of papers, prioritizing data, and linking that data to the specific decision at hand. It will introduce “new real-time factors to be considered, such as current news events around economic instability, political unrest, and natural disasters.”

The oil and gas industry boasts some of the most advanced geological, geophysical and chemical science in the world. But the challenge is to integrate critical geopolitical, economic, and other global news into decisions. And that will require a whole new approach to computing that can speed access to business insights, enhance strategic decision-making, and drive productivity, IBM says.

This goes beyond the capabilities of Watson. But scientists at IBM’s Cognitive Environments Laboratory (CEL), collaborating with Repsol, plan to develop and apply new prototype cognitive tools for real-world use cases in the oil and gas industry. They will experiment with a combination of traditional and new interfaces based upon spoken dialog, gesture, robotics and advanced visualization and navigation techniques.

The objective is to build conceptual and geological models, highlight the impact of the potential risks and uncertainty, visualize trade-offs, and explore what-if scenarios to ensure the best decision is made, IBM says.

Repsol is making an initial investment of $15 million to $20 million to develop two applications targeted for next year, Repsol’s director for exploration and production technology Santiago Quesada explained to Bloomberg Business Week. “One app will be used for oil exploration and the other to help determine the most attractive oil and gas assets to buy.”

Scooped by Dr. Stefan Gruenwald!

Giant tortoises regain foothold on a Galapagos island - from 15 to over 1,000 breeding on their own again


An endangered population of giant tortoises has recovered on the Galapagos island of Española.

Some 40 years after the first captive-bred tortoises were reintroduced to the island by the Galapagos National Park Service, the endemic Española giant tortoises are reproducing and restoring some of the ecological damage caused by feral goats that were brought to the island in the late 19th century.

"The global population was down to just 15 tortoises by the 1960s. Now there are some 1,000 tortoises breeding on their own. The population is secure. It's a rare example of how biologists and managers can collaborate to recover a species from the brink of extinction," said James P. Gibbs, a professor of vertebrate conservation biology at the SUNY College of Environmental Science and Forestry (ESF) and lead author of the paper published in the journal PLOS ONE.

Gibbs and his collaborators assessed the tortoise population using 40 years of data from tortoises marked and recaptured repeatedly for measurement and monitoring by members of the Galapagos National Park Service, Charles Darwin Foundation, and visiting scientists.

But there is another side to the success story: while the tortoise population is stable, it is not likely to increase until more of the landscape recovers from the damage inflicted by the now-eradicated goats. After the goats devoured all the grassy vegetation and were subsequently removed from the island, more shrubs and small trees have grown on Española. This hinders both the growth of cactus, which is a vital piece of a tortoise's diet, and the tortoises' movement. Chemical analysis of the soil, done by Dr. Mark Teece, an ESF chemistry professor, shows there has been a pronounced shift from grasses to woody plants on the island in the last 100 years.


  1. James P. Gibbs, Elizabeth A. Hunter, Kevin T. Shoemaker, Washington H. Tapia, Linda J. Cayot. Demographic Outcomes and Ecosystem Implications of Giant Tortoise Reintroduction to Española Island, Galapagos. PLoS ONE, 2014; 9(10): e110742. DOI: 10.1371/journal.pone.0110742

Scooped by Dr. Stefan Gruenwald!

Electron wave function is split by tunnelling into different regions


New research by physicists from Brown University puts the profound strangeness of quantum mechanics in a nutshell—or, more accurately, in a helium bubble.

Experiments led by Humphrey Maris, professor of physics at Brown, suggest that the quantum state of an electron—the electron's wave function—can be shattered into pieces and those pieces can be trapped in tiny bubbles of liquid helium. To be clear, the researchers are not saying that the electron can be broken apart. Electrons are elementary particles, indivisible and unbreakable. But what the researchers are saying is in some ways more bizarre.

In quantum mechanics, particles do not have a distinct position in space. Instead, they exist as a wave function, a probability distribution that includes all the possible locations where a particle might be found. Maris and his colleagues are suggesting that parts of that distribution can be separated and cordoned off from each other.

"We are trapping the chance of finding the electron, not pieces of the electron," Maris said. "It's a little like a lottery. When lottery tickets are sold, everyone who buys a ticket gets a piece of paper. So all these people are holding a chance and you can consider that the chances are spread all over the place. But there is only one prize—one electron—and where that prize will go is determined later."

If Maris's interpretation of his experimental findings is correct, it raises profound questions about the measurement process in quantum mechanics. In the traditional formulation of quantum mechanics, when a particle is measured—meaning it is found to be in one particular location—the wave function is said to collapse.

"The experiments we have performed indicate that the mere interaction of an electron with some larger physical system, such as a bath of liquid helium, does not constitute a measurement," Maris said. "The question then is: What does?"

Scooped by Dr. Stefan Gruenwald!

Isotope effect produces new type of chemical bond - the vibrational muonium bond


Researchers believe they have confirmed the existence of a new type of chemical bond, first proposed some 30 years ago but never convincingly demonstrated because of the lack of experimental evidence and the relatively poor accuracy of the quantum chemistry methods that prevailed at the time. The new work also shows how substituting isotopes can result in fundamental changes in the nature of chemical bonding.

In the early 1980s it was proposed that in certain transition states consisting of a very light atom sandwiched between two heavy ones, the system would be stabilised not by conventional van der Waals forces, but by vibrational bonding, with the light atom shuttling between its two neighbours. However, despite several groups searching for such a system, none was demonstrated and the hunt fizzled out.

Now, Jörn Manz, of the Free University of Berlin and Shanxi University in China, and colleagues believe they have the theoretical and experimental evidence to demonstrate a stable vibrational bond.

The researchers carried out a series of theoretical calculations examining the reaction of BrH with Br to create the radical BrHBr, but using different isotopes of hydrogen. By using muons – elementary particles that are similar to an electron but have greater mass – the team substituted a range of hydrogen isotopes into BrHBr, from the relatively hefty muonic helium, 4.1H, to the extremely light muonium, Mu, with a mass nearly 40 times smaller than that of 4.1H.

The team mapped two key parameters: the potential energy surface of the system – the three-dimensional potential energy ‘landscape’, with hills and valleys, that relates the energy of the system to its geometry – and a quantum mechanical parameter, the vibrational zero-point energy, or ZPE.

Classically, a bond will form if there is a net reduction in the potential energy of the system. However, in certain circumstances, if there is a sufficiently large decrease in the vibrational ZPE, this can overcome the need for a decrease in potential energy and the system can be stabilised by a vibrational bond.
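The mass dependence that makes the light isotopes special is transparent in the harmonic approximation, where the zero-point energy is ½ħ√(k/m): the lighter the isotope, the larger the ZPE in the same potential well. A sketch with an assumed force constant (illustrative numbers, not values from the study):

```python
import math

hbar = 1.054571817e-34   # J*s
amu = 1.66053906660e-27  # kg per atomic mass unit
k_force = 500.0          # N/m -- an assumed, typical-order bond force constant

def zpe_joules(mass_in_amu):
    """Harmonic-oscillator zero-point energy: (1/2) * hbar * sqrt(k/m)."""
    return 0.5 * hbar * math.sqrt(k_force / (mass_in_amu * amu))

zpe_H = zpe_joules(1.008)    # ordinary hydrogen
zpe_Mu = zpe_joules(0.114)   # muonium, roughly 1/9 the mass of hydrogen

# Lighter isotope, same well => larger ZPE, scaling as 1/sqrt(mass):
print(f"ZPE(Mu)/ZPE(H) = {zpe_Mu / zpe_H:.2f}")
```

Because the ratio scales as 1/√m, swapping hydrogen for muonium roughly triples the ZPE in the same well, which is why ZPE changes can become large enough to dominate over the potential energy term for the lightest isotope.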

Scooped by Dr. Stefan Gruenwald!

New solar power material converts 90 percent of captured light into heat


A multidisciplinary engineering team at the University of California, San Diego developed a new nanoparticle-based material for concentrating solar power plants designed to absorb and convert to heat more than 90 percent of the sunlight it captures. The new material can also withstand temperatures greater than 700 degrees Celsius and survive many years outdoors in spite of exposure to air and humidity. Their work, funded by the U.S. Department of Energy's SunShot program, was published recently in two separate articles in the journal Nano Energy.

By contrast, current solar absorber material functions at lower temperatures and needs to be overhauled almost every year for high-temperature operations. "We wanted to create a material that absorbs sunlight that doesn't let any of it escape. We want the black hole of sunlight," said Sungho Jin, a professor in the department of Mechanical and Aerospace Engineering at UC San Diego Jacobs School of Engineering. Jin, along with professor Zhaowei Liu of the department of Electrical and Computer Engineering, and Mechanical Engineering professor Renkun Chen, developed the silicon boride-coated nanoshell material. They are all experts in functional materials engineering.

The novel material features a "multiscale" surface created by using particles of many sizes ranging from 10 nanometers to 10 micrometers. The multiscale structures can trap and absorb light which contributes to the material's high efficiency when operated at higher temperatures.

Concentrating solar power (CSP) is an emerging alternative clean energy market that produces approximately 3.5 gigawatts worth of power at power plants around the globe—enough to power more than 2 million homes, with additional construction in progress to provide as much as 20 gigawatts of power in coming years. One of the technology's attractions is that it can be used to retrofit existing power plants that use coal or fossil fuels because it uses the same process to generate electricity from steam.
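As a quick sanity check on those headline figures (a rough division of nameplate capacity by homes served, ignoring capacity factor):

```python
# 3.5 GW of installed CSP capacity serving "more than 2 million homes"
# implies an average allocation of roughly 1.75 kW per home.
capacity_w = 3.5e9   # installed CSP capacity worldwide, watts
homes = 2e6          # homes powered, per the article

kw_per_home = capacity_w / homes / 1e3
print(f"~{kw_per_home:.2f} kW of capacity per home")
```

Real plants deliver only a fraction of nameplate capacity over a day, so the per-home figure is an upper bound, but the orders of magnitude are consistent.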

Scooped by Dr. Stefan Gruenwald!

Google X project plans to use magnetic nanoparticles and wearable sensor to detect diseases


Google announced a new “Nanoparticle Platform” project Tuesday to develop medical diagnostic technology using nanoparticles, Andrew Conrad, head of the Google X Life Sciences team, disclosed at The Wall Street Journal’s WSJD Live conference. The idea is to use nanoparticles with magnetic cores circulating in the bloodstream with recognition molecules to detect cancer, plaques, or too much sodium, for example.

There are a number of similar research projects using magnetic (and other) nanoparticles in progress, as reported on KurzweilAI. What’s new in the Google project is delivering nanoparticles to the bloodstream via a pill and using a wearable wrist detector to detect the nanoparticles’ magnetic field and read out diagnostic results.

But this is an ambitious moonshot project. “Google is at least five to seven years away from a product approved for use by doctors,” said Sam Gambhir, chairman of radiology at Stanford University Medical School, who has been advising Dr. Conrad on the project for more than a year, the WSJ reports.

“Even if Google can make the system work, it wouldn’t immediately be clear how to interpret the results. That is why Dr. Conrad’s team started the Baseline study [see “New Google X Project to look for disease and health patterns in collected data”], which he hopes will create a benchmark for comparisons.”

Scooped by Dr. Stefan Gruenwald!

Lights out: study shows urgent need to address instability of world's power grid


Research by Hugh Byrd, Professor of Architecture at the University of Lincoln, UK, and Steve Matthewman, Associate Professor of Sociology at the University of Auckland, New Zealand, highlights the insecurities of power systems and weakening electrical infrastructure across the globe, particularly in built-up urban areas.

The work builds on previous studies which examined a sharp increase in electrical usage over recent years, and warned the world to prepare for the prospect of coping without electricity as instances of complete power failure become increasingly common.

Professor Byrd explained: “We have previously highlighted that demand for new technology continues to grow at an unprecedented rate. Our new research emphasizes why energy sources are becoming increasingly inadequate, and simply cannot continue to meet this demand.

“Throughout our study, we observed a number of network failures due to inadequate energy, whether through depletion of resources such as oil and coal, or through the vagaries of the climate in the creation of renewable energy.”

The British energy regulator Ofgem has predicted a fall in spare electrical power production capacity to two per cent by 2015, meaning there is now even less flexibility of supply to adjust to spikes in demand. 

The issue of energy security exists for countries which have access to significant renewable power supplies too. With rain, wind and sunshine becoming less predictable due to changes brought about by global warming, the new research found that severe blackouts in Kenya, India, Tanzania and Venezuela, which all occurred during the last decade, were caused by shortages of rain at hydro dams.

Further to the irregularities involved in renewable power generation, the study concludes that worldwide electricity supply will also become increasingly precarious due to industry privatization and neglect of infrastructure.

Professor Matthewman said: “Over the past two decades, deregulation and privatization have become major global trends within the electrical power industry. In a competitive environment, reliability and profits may be at cross-purposes — single corporations can put their own interests ahead of the shared grid, and spare capacity is reduced in the name of cost saving. There is broad consensus among energy specialists, national advisory bodies, the reinsurance industry, and organizational sociologists that this has exacerbated blackout risk.”

These trends have seen the separation of power generation, transmission and distribution services – a process which Professors Byrd and Matthewman suggest only opens up more opportunity for electrical disruption. Their study reveals the difficulties that arise when different technical and human systems need to communicate, and points to a breakdown in this type of communication as the main cause behind two of the world’s worst ever blackouts – from Ohio, USA, to Ontario, Canada, in 2003; and across Italy and neighboring nations in the same year. Together, these power failures affected more than 100 million people.

Scooped by Dr. Stefan Gruenwald!

San Diego company develops 10-minute $10 Ebola test


With a single prick and a single drop of blood, a San Diego company claims it can now detect whether a patient has Ebola in less than 10 minutes. The breakthrough technology is called “Ebola Plus,” a tool that can be used to detect Ebola on anyone, anywhere in the world.

“We can do that for a large number of tests simultaneously with just one drop of blood,” said Dr. Cary Gunn, Ph.D., CEO of Genalyte. Once blood is drawn, a silicon chip is used to detect the virus as blood flows over it.

Researchers at Genalyte have been working on the diagnostic tool for seven years, using it to test for various diseases, and only recently discovered it could also work to spot Ebola. “It allows you to screen more patients more rapidly. The biggest question right now is the debate about quarantine. Instead of asking people to take their fever once or twice a day, they can just take a prick of blood,” said Dr. Gunn.

The test can analyze up to 100 samples per hour, and be administered anywhere, including hospitals, airports, and even remote areas in West Africa where the disease is spreading rapidly. “Right now, most people in Liberia aren’t even being tested. People who have suspicion of having Ebola are being checked into wards. The ability to take a prick of blood and do the test would be a game changer in that environment,” said Gunn.

Developing the platform for the test cost Genalyte around $100,000, but each chip used during testing costs $10 – making early detection cheaper and easier for caretakers. Currently, the FDA has approved only PCR tests, which can take two hours to return results, compared to the ten minutes of Ebola Plus.

Scooped by Dr. Stefan Gruenwald!

Rapid Evolution of Anole Populations in Real Time


On islands off the coast of Florida, scientists uncover swift adaptive changes among Carolina anole populations, whose habitats were disturbed by the introduction of another lizard species.

For most of its existence, the Carolina anole (Anolis carolinensis) was the only lizard in the southeastern U.S. It could perch where it wanted, eat what it liked. But in the 1970s, aided by the human pet trade, the brown anole (Anolis sagrei)—native to Cuba and the Bahamas—came marching in. In experiments on islands off the coast of Florida, scientists studying the effects of the species mixing witnessed evolution in action: the Carolina anole started perching higher up in trees, and its toe pads changed to enable better grip—all in a matter of 15 years, or about 20 lizard generations.

In a paper published in Science today (October 23), Yoel Stuart of the University of Texas at Austin, Todd Campbell from the University of Tampa, Florida, and their colleagues discuss what happened when the two species converged upon the same habitats.
