Amazing Science
Scooped by Dr. Stefan Gruenwald!

Data smashing: Uncovering lurking order in underlying data


From recognizing speech to identifying unusual stars, new discoveries often begin with comparison of data streams to find connections and spot outliers. But simply feeding raw data into a data-analysis algorithm is unlikely to produce meaningful results, say the authors of a new Cornell study. That’s because most data comparison algorithms today have one major weakness: somewhere, they rely on a human expert to specify what aspects of the data are relevant for comparison, and what aspects aren’t.

But these experts can’t keep up with the growing amounts and complexities of big data. So the Cornell computing researchers have come up with a new principle they call “data smashing” for estimating the similarities between streams of arbitrary data without human intervention, and even without access to the data sources.

Data smashing is based on a new way to compare data streams. The process involves two steps.

  1. The data streams are algorithmically “smashed” to “annihilate” the information in each other.
  2. The process measures what information remains after the collision. The more information remains, the less likely the streams originated from the same source.
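The published algorithm constructs "anti-streams" that cancel a stream's statistical structure; as a loose stand-in that illustrates the same parameter-free idea — scoring similarity between raw streams without telling the algorithm which features matter — the sketch below uses normalized compression distance (a different but related technique) via Python's zlib:

```python
import zlib

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: a parameter-free dissimilarity
    score; smaller values mean the two streams share more structure."""
    cx = len(zlib.compress(x, 9))
    cy = len(zlib.compress(y, 9))
    cxy = len(zlib.compress(x + y, 9))
    return (cxy - min(cx, cy)) / max(cx, cy)

s1 = b"abcd" * 500                                   # periodic stream
s2 = (b"abcd" * 500)[2:] + b"ab"                     # same source, shifted phase
s3 = bytes((i * 97 + 31) % 256 for i in range(2000)) # unrelated stream

# The same-source pair scores as more similar, with no feature engineering:
assert ncd(s1, s2) < ncd(s1, s3)
```

No domain knowledge about the streams is supplied anywhere — which is the point the Cornell work pushes much further, down to streams whose alphabet and source are unknown.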

Data-smashing principles could open the door to understanding increasingly complex observations, especially when experts don’t know what to look for, according to the researchers. Hod Lipson, associate professor of mechanical engineering and of computing and information science, and Ishanu Chattopadhyay, a former postdoctoral associate with Lipson now at the University of Chicago, demonstrated the idea with data from real-world problems, including detection of anomalous cardiac activity from heart recordings and classification of astronomical objects from raw photometry.

In all cases and without access to original domain knowledge, the researchers demonstrated that the performance of these general algorithms was on par with the accuracy of specialized algorithms and heuristics tweaked by experts to work.


World's first photonic pressure sensor outshines traditional mercury standard


For almost 400 years, mercury gauges have prevailed as the most accurate way to measure pressure. Now, within weeks of seeing "first light," a novel pressure-sensing device has surpassed the performance of the best mercury-based techniques in resolution, speed, and range at a fraction of the size. The new instrument, called a fixed-length optical cavity (FLOC), works by detecting subtle changes in the wavelength of light passing through a cavity filled with nitrogen gas.

The FLOC system is poised to depose traditional mercury pressure sensors – also called manometers – as the standard used to calibrate commercial equipment, says the interdisciplinary team of NIST researchers who developed the system and will continue to refine it over the next few years. The new design is also a promising candidate for a factory-floor pressure instrument that could be used by a range of industries, including those associated with semiconductor, glass, and aerospace manufacturing.

"We've exceeded the expectations we had three years ago," says Thermodynamic Metrology Group Leader Gregory Strouse. "This device is not only a photonic sensor, it's also a primary standard. It's the first photonic-based primary pressure standard. And it works."

About the size of a travel mug, the FLOC has a resolution of 0.1 mPa (millipascals, or thousandths of a pascal), 36 times better than NIST's official U.S. pressure standard, which is a 3-meter-tall (about 10-foot) column of liquid mercury that extends through the ceiling of the calibration room.
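In outline, the measurement works because nitrogen raises the refractive index of the cavity in proportion to its density, shifting the cavity's resonance relative to a vacuum reference; inverting that shift yields pressure. A minimal sketch of the inversion, using an illustrative refractivity constant rather than NIST's calibrated value:

```python
def pressure_from_index_shift(n_minus_1: float, temp_k: float = 293.15,
                              k: float = 2.7e-4, p_ref: float = 101_325.0,
                              t_ref: float = 293.15) -> float:
    """Invert the ideal-gas refractivity relation
        n - 1 = k * (P / p_ref) * (t_ref / T)
    for the pressure P in pascals.

    k is an illustrative nitrogen refractivity at the reference
    conditions, not a calibrated value.
    """
    return n_minus_1 * (p_ref / k) * (temp_k / t_ref)

# A fractional index change of ~2.7e-4 corresponds to about 1 atmosphere:
p_atm = pressure_from_index_shift(2.7e-4)
```

At the FLOC's quoted 0.1 mPa resolution the index change is of order 10^-13, which is why an optical cavity — converting that tiny index change into a countable frequency shift — is the natural instrument.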


Algal virus found in human throats, affects brain function


It’s not such a stretch to think that humans can catch the Ebola virus from monkeys and the flu virus from pigs. After all, they are all mammals with fundamentally similar physiologies. But now researchers have discovered that even a virus found in the lowly algae can make mammals its home. The invader doesn’t make people or mice sick, but it does seem to slow specific brain activities.

The virus, called ATCV-1, showed up in human brain tissue several years ago, but at the time researchers could not be sure whether it had entered the tissue before or after the people died. Then, it showed up again in a survey of microbes and viruses in the throats of people with psychiatric disease. Pediatric infectious disease expert Robert Yolken from Johns Hopkins University School of Medicine in Baltimore, Maryland, and his colleagues were trying to see if pathogens play a role in these conditions. At first, they didn't know what ATCV-1 was, but a database search revealed its identity as a virus that typically infects a species of green algae found in lakes and rivers.

The researchers wanted to find out if the virus was in healthy people as well as sick people. They checked for it in 92 healthy people participating in a study of cognitive function and found it in 43% of them. What’s more, those infected with the virus performed 10% worse than uninfected people on tests requiring visual processing. They were slower in drawing a line connecting a sequence of numbers randomly placed on a page, for example. And they seemed to have shorter attention spans, the researchers report online today in the Proceedings of the National Academy of Sciences. The effects were modest, but significant.

The slower brain function was not associated with any differences in sex, income or education level, race, place of birth, or cigarette smoking. But that doesn't necessarily mean the virus causes cognitive decline; it might just benefit from some other factor that impairs the brain in some people, such as other infectious agents, heavy metals, or pollutants, the researchers say.

To test for causality, the team injected uninfected and infected green algae into the mouths of mice. (They could tell that the mice became infected with the virus because they developed antibodies to it.) Infected and uninfected mice underwent a battery of tests. The two groups were about on par with how well they moved, but infected animals took 10% longer to find their way out of mazes and spent 20% less time exploring new objects—indications that they had poorer attention spans and were not as good at remembering their surroundings.


What hides beneath: Mapping long lost megalopolis underneath tropical rainforest


Archaeologists are known for getting their hands—and everything else—dirty. However, in April 2012, archaeologist Damian Evans boarded a helicopter and spent hours being flown over the dense foliage surrounding Angkor Wat, Cambodia’s legendary complex of ancient stone temples. The heat inside the helicopter was intense, reaching upward of 40 °C, but Evans, a faculty member of the University of Sydney based in Cambodia, much preferred flying over the trees to trudging through the vegetation beneath them. The visible, known temples in Angkor were well-trodden and populated with visitors from all over the world. However, the outlying forest beneath Evans’s ride, lush and green from the air, hid land mines left over from Cambodia’s tumultuous past.

In recent years, archaeologists have used airborne laser scanning, a technology called LiDAR (a portmanteau of “light” and “radar”), to find ruins of structures—roads, canals, temples, reservoirs—long overgrown with vegetation and hidden from easy observation. LiDAR is revolutionizing what scientists think about the size of ancient cities and how ancient civilizations used the land. It has accelerated the pace of surveying nearly impenetrable areas to a rate that would have been unthinkable just a few years ago.

After 20 hours of flying over two days, the helicopter carrying Evans had surveyed 370 square kilometers (91,400 acres) and Evans had as much data about Angkor’s hidden landforms as he might have gathered during his entire career.

“To achieve the same number of data points as we did with LiDAR would have taken decades on the ground,” Evans says. In addition, Evans says he and his colleagues suspected that previous studies of the area were incomplete. “Our concerns were that previous research had missed three-quarters of the downtown metropolitan part of Angkor.”

The urban sprawl around the temples of Angkor had already been identified as the largest integrated settlement complex of the preindustrial world. However, Evans’s LiDAR map confirmed that existing surveys had been vastly underestimating the size of the formally planned street grid in the central area of the city.

In a 2007 PNAS study (1) that combined ground surveys with airborne radar mapping, Evans and his colleagues first found a chaotic urban sprawl beyond the city walls of the Angkor Wat complex, showing that the temples were the centers of large urban landscapes.

In a study based on LiDAR maps, published in PNAS in July 2013 (2), Evans et al. reported that the lasers revealed in exceptional detail that the now-tangled land outside the temple walls had once, 1,000 years ago, been divided into neat rectangles like city blocks, with canals, ponds, and residences. The urban grids found inside the walls had been built outside, too, covering an area of 36 square kilometers.

“It’s relatively easy to draw a line around temples, but the revelation from LiDAR is that you find this web of subtle traces of urban networks,” Evans says.

The new study suggests the Angkor capitals were much more densely populated than was previously believed, offering more evidence for a growing idea that the Khmer civilization grew so large that it couldn’t grow enough crops to keep up. Overpopulation, combined with the lack of sustainable agricultural methods, may have left the cities vulnerable to the decades-long droughts that struck in the 14th and 15th centuries, at the same time that the Khmer kings abandoned Angkor.


Turning loss into gain: Cutting power could dramatically boost laser output

Lasers – devices that deliver beams of highly organized light – are so deeply integrated into modern technology that their basic operations would seem well understood. CD players, medical diagnostics and military surveillance all depend on lasers.

Re-examining longstanding beliefs about the physics of these devices, Princeton engineers have now shown that carefully restricting the delivery of power to certain areas within a laser could boost its output by many orders of magnitude. The finding, published Oct. 26 in the journal Nature Photonics, could allow far more sensitive and energy-efficient lasers, as well as potentially more control over the frequencies and spatial pattern of light emission.

"It's as though you are using loss to your advantage," said graduate student Omer Malik, an author of the study along with Li Ge, now an assistant professor at the City University of New York, and Hakan Tureci, assistant professor of electrical engineering at Princeton. The researchers said that restricting the delivery of power causes much of the physical space within a laser to absorb rather than produce light. In exchange, however, the optimally efficient portion of the laser is freed from competition with less efficient portions and shines forth far more brightly than previous estimates had suggested.

The results, based on mathematical calculations and computer simulations, still need to be verified in experiments with actual lasers, but the researchers said the work represents a new understanding of the fundamental processes that govern how lasers produce light.

"Distributing gain and loss within the material is a higher level of design – a new tool – that had not been used very systematically until now," Tureci said.

The heart of a laser is a material that emits light when energy is supplied to it. When a low level of energy is added, the light is "incoherent," essentially meaning that it contains a mix of wavelengths (or colors). As more energy is added, the material suddenly reaches a "lasing" threshold when it emits coherent light of a particular wavelength.

The entire surface of the material does not emit laser light; rather, if the material is arranged as a disc, for example, the light might come from a ring close to the edge. As even more energy is added, more patterns emerge – for example a ring closer to the center might reach the laser threshold. These patterns – called modes – begin to interact and sap energy from each other. Because of this competition, subsequent modes requiring higher energy may never reach their lasing thresholds. However, Tureci's research group found that some of these higher threshold modes were potentially far more efficient than the earlier ones if they could just be allowed to function without competition.
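A toy steady-state model — not the Princeton group's actual calculation — illustrates the trade they exploit: sacrificing a low-threshold but low-efficiency mode (by adding loss where it lives) can release a higher-threshold mode whose slope efficiency is far greater:

```python
def output_above_threshold(pump: float, threshold: float,
                           slope_efficiency: float) -> float:
    """Toy laser model: output grows linearly with pump above threshold."""
    return max(0.0, slope_efficiency * (pump - threshold))

pump = 10.0
# Illustrative numbers: mode 1 lases first but converts pump poorly;
# mode 2 needs more pump but is far more efficient once lasing.
mode1 = output_above_threshold(pump, threshold=2.0, slope_efficiency=0.1)
mode2 = output_above_threshold(pump, threshold=5.0, slope_efficiency=3.0)

# With mode competition, mode 1 clamps the gain near its own threshold and
# mode 2 may never reach lasing at all; suppressing mode 1 with targeted
# loss gives up its small output in exchange for mode 2's much larger one.
assert mode2 > mode1
```

The real calculation in the Nature Photonics paper treats the full spatial interaction between modes, but the asymmetry above is the intuition behind "using loss to your advantage."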


The top 100 papers: NATURE magazine explores the most-cited research papers of all time


The discovery of high-temperature superconductors, the determination of DNA’s double-helix structure, the first observations that the expansion of the Universe is accelerating — all of these breakthroughs won Nobel prizes and international acclaim. Yet none of the papers that announced them comes anywhere close to ranking among the 100 most highly cited papers of all time.

Citations, in which one paper refers to earlier works, are the standard means by which authors acknowledge the source of their methods, ideas and findings, and are often used as a rough measure of a paper’s importance. Fifty years ago, Eugene Garfield published the Science Citation Index (SCI), the first systematic effort to track citations in the scientific literature. To mark the anniversary, Nature asked Thomson Reuters, which now owns the SCI, to list the 100 most highly cited papers of all time. The search covered all of Thomson Reuters’ Web of Science, an online version of the SCI that also includes databases covering the social sciences, arts and humanities, conference proceedings and some books. It lists papers published from 1900 to the present day.

The exercise revealed some surprises, not least that it takes a staggering 12,119 citations to rank in the top 100 — and that many of the world’s most famous papers do not make the cut. A few that do, such as the first observation of carbon nanotubes (number 36), are indeed classic discoveries. But the vast majority describe experimental methods or software that have become essential in their fields.

The most cited work in history, for example, is a 1951 paper describing an assay to determine the amount of protein in a solution. It has now gathered more than 305,000 citations — a recognition that always puzzled its lead author, the late US biochemist Oliver Lowry.


DARPA amplifier circuit achieves speeds of 1 trillion Hz, enters Guinness World Records


Officials from Guinness World Records have recognized DARPA’s Terahertz Electronics program for creating the fastest solid-state amplifier integrated circuit ever measured: one terahertz (10^12 Hz), or one trillion cycles per second — 150 billion cycles per second faster than the existing world record set in 2012.

“This breakthrough could lead to revolutionary technologies such as high-resolution security imaging systems, improved collision-avoidance radar, communications networks with many times the capacity of current systems, and spectrometers that could detect potentially dangerous chemicals and explosives with much greater sensitivity,” said Dev Palmer, DARPA program manager.

Developed by Northrop Grumman Corporation, the Terahertz Monolithic Integrated Circuit (TMIC) exhibits power gains (amplification) several orders of magnitude beyond the current state of the art by using a super-scaled 25 nanometer gate-length indium phosphide high electron mobility transistor.

The TMIC showed a measured gain (on the logarithmic scale) of nine decibels at 1.0 terahertz and eight decibels at 1.03 terahertz. “Nine decibels of gain is unheard of at terahertz frequencies,” said Palmer. “This opens up new possibilities for building terahertz radio circuits.”
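Decibels are logarithmic, so the reported figures convert to linear power ratios as 10^(dB/10):

```python
def db_to_power_ratio(db: float) -> float:
    """Convert gain in decibels to a linear power ratio."""
    return 10 ** (db / 10)

gain_at_1_00_thz = db_to_power_ratio(9)  # ~7.9x power amplification
gain_at_1_03_thz = db_to_power_ratio(8)  # ~6.3x power amplification
```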

By contrast, current smartphone technology operates at one to two gigahertz and wireless networks at 5.7 gigahertz.

For years, researchers have been looking to exploit the high-frequency sub-millimeter-wave spectrum beginning above 300 gigahertz. Current electronics using solid-state technologies have largely been unable to access the sub-millimeter band of the electromagnetic spectrum due to insufficient transistor performance.

To address the “terahertz gap,” engineers have traditionally used frequency conversion—converting alternating current at one frequency to alternating current at another frequency—to multiply circuit operating frequencies up from millimeter-wave frequencies.

This approach, however, restricts the output power of electrical devices and adversely affects signal-to-noise ratio. Frequency conversion also increases device size, weight and power supply requirements.


Making designer mutants in all kinds of model organisms


Recent advances in the targeted modification of complex eukaryotic genomes have unlocked a new era of genome engineering. From the pioneering work using zinc-finger nucleases (ZFNs), to the advent of the versatile and specific TALEN systems, and most recently the highly accessible CRISPR/Cas9 systems, we now possess an unprecedented ability to analyze developmental processes using sophisticated designer genetic tools. Excitingly, these robust and simple genomic engineering tools also promise to revolutionize developmental studies using less well established experimental organisms.

Modern developmental biology was born out of the fruitful marriage between traditional embryology and genetics. Genetic tools, together with advanced microscopy techniques, serve as the most fundamental means for developmental biologists to elucidate the logistics and the molecular control of growth, differentiation and morphogenesis. For this reason, model organisms with sophisticated and comprehensive genetic tools have been highly favored for developmental studies. Advances made in developmental biology using these genetically amenable models have been well recognized. The Nobel Prize in Physiology or Medicine was awarded in 1995 to Edward B. Lewis, Christiane Nüsslein-Volhard and Eric F. Wieschaus for their discoveries on the ‘genetic control of early embryonic development’ using Drosophila melanogaster, and again in 2002 to John Sulston, Robert Horvitz and Sydney Brenner for their discoveries on the ‘genetic regulation of organ development and programmed cell death’ using the nematode worm Caenorhabditis elegans. These fly and worm systems remain powerful and popular models for invertebrate development studies, while zebrafish (Danio rerio), the two frog species Xenopus laevis and Xenopus tropicalis, rat (Rattus norvegicus), and particularly mouse (Mus musculus) represent the most commonly used vertebrate model systems. To date, random or semi-random mutagenesis (‘forward genetic’) approaches have been extraordinarily successful at advancing the use of these model organisms in developmental studies. With the advent of reference genomic data, however, sequence-specific genomic engineering tools (‘reverse genetics’) enable targeted manipulation of the genome and thus allow previously untestable hypotheses of gene function to be addressed.


Sequenced genomes reveal mutations that disable single genes and can help to identify new drugs

On average, every person carries mutations that inactivate at least one copy of 200 or so genes and both copies of around 20 genes. However, knockout mutations in any particular gene are rare, so very large populations are needed to study their effects. These ‘loss of function’ mutations have long been implicated in certain debilitating diseases, such as cystic fibrosis. Most, however, seem to be harmless — and some are even beneficial to the persons carrying them. “These are people we’re not going to find in a clinic, but they’re still really informative in biology,” says MacArthur.

His group and others had been focusing on genome data, but they are now also starting to mine patient-health records to determine the — sometimes subtle — effects of the mutations. In a study of more than 36,000 Finnish people, published in July (E. T. Lim et al. PLoS Genet. 10, e1004494; 2014), MacArthur and his team discovered that people lacking a gene called LPA might be protected from heart disease, and that another knockout mutation, carried in one copy of a gene by up to 2.4% of Finns, may cause fetuses to miscarry if it is present in both copies.

Bing Yu of the University of Texas Health Science Center in Houston told the meeting how he and his collaborators had compared knockout mutations found in more than 1,300 people with measurements of around 300 molecules in their blood. The team found that mutations in one gene, called SLCO1B1, were linked to high levels of fatty acids, a known risk factor for heart failure. And a team from the Wellcome Trust Sanger Institute in Hinxton, UK, reported that 43 genes whose inactivation is lethal to mice were found to be inactivated in humans who are alive and apparently well.

The poster child for human-knockout efforts is a new class of drugs that block a gene known as PCSK9 (see Nature 496, 152–155; 2013). The gene was discovered in French families with extremely high cholesterol levels in the early 2000s. But researchers soon found that people with rare mutations that inactivate one copy of PCSK9 have low cholesterol and rarely develop heart disease. The first PCSK9-blocking drugs should hit pharmacies next year, with manufacturers jostling for a share of a market that could reach US$25 billion in five years.

“I think there are hundreds more stories like PCSK9 out there, maybe even thousands,” in which a drug can mimic an advantageous loss-of-function mutation, says Eric Topol, director of the Scripps Translational Science Institute in La Jolla, California. Mark Gerstein, a bioinformatician at Yale University in New Haven, Connecticut, predicts that human knockouts will be especially useful for identifying drugs that treat diseases of ageing. “You could imagine there’s a gene that is beneficial to you as a 25-year-old, but the thing is not doing a good job for you when you’re 75.”


Drying Amazon Could Be Major Carbon Concern Going Forward


The lungs of the planet are drying out, threatening to cause Earth to cough up some of its carbon reserves. The Amazon rainforest inhales massive amounts of carbon dioxide from the atmosphere, helping keep the globe’s carbon budget in balance (at least until human emissions started throwing that balance off). But as a new study shows, drier conditions since 2000 have been reducing that lung capacity. And if the Amazon’s breaths become shallower, a feedback loop could set in, further reducing lung capacity and throwing the carbon balance further out of whack.

The study, published in the Proceedings of the National Academy of Sciences on Monday, shows that a decline in precipitation has contributed to less healthy vegetation since 2000. “It’s well-established fact that a large part of Amazon is drying. We’ve been able to link that decline in precipitation to a decline in greenness over the last 10 years,” said Thomas Hilker, lead author of the study and forestry expert at Oregon State University.

Since 2000, rainfall has decreased by up to 25 percent across a vast swath of the southeastern Amazon, according to the new satellite analysis by Hilker. The cause of the decline in rainfall hasn’t been pinpointed, though deforestation and changes in atmospheric circulation are possible culprits.

The decrease mostly affected an area of tropical forest 12 times the size of California, as well as adjacent grasslands and other forest types. The browning of that area, which is in the southern Amazon, accounted for more than half the loss of greenness observed by satellites. While the decrease in greenness is small compared with the overall lushness of the rainforest, the impacts could be outsize.

That’s because the amount of carbon the Amazon stores is staggering. An estimated 120 billion tons of carbon are stashed in its plants and soil. Much of that carbon gets there via the forest flora that suck carbon dioxide out of the atmosphere. Worldwide, “it essentially takes up 25 percent of global carbon cycle that vegetation is responsible for,” Hilker said. “It’s a huge carbon stock.”


255 Terabits/s: Researchers demonstrate record data transmission over new type of fiber


Researchers at Eindhoven University of Technology (TU/e) in the Netherlands and the University of Central Florida (CREOL), report in the journal Nature Photonics the successful transmission of a record high 255 Terabits/s over a new type of fiber allowing 21 times more bandwidth than currently available in communication networks. This new type of fiber could be an answer to mitigating the impending optical transmission capacity crunch caused by the increasing bandwidth demand.

Due to the popularity of Internet services and the emerging network of capacity-hungry datacenters, demand for telecommunication bandwidth is expected to continue growing at an exponential rate. To transmit more information through current optical glass fibers, one option is to increase the power of the signals to overcome the losses inherent in the glass from which the fiber is manufactured. However, this produces unwanted photonic nonlinear effects, which limit the amount of information that can be recovered after transmission over the standard fiber.

The team at TU/e and CREOL, led by Dr. Chigo Okonkwo, an assistant professor in the Electro-Optical Communications (ECO) research group at TU/e, and Dr. Rodrigo Amezcua Correa, a research assistant professor in micro-structured fibers at CREOL, demonstrate the potential of a new class of fiber to increase transmission capacity and mitigate the impending 'capacity crunch' in their article that appeared yesterday in the online edition of the journal Nature Photonics.

The new fiber has seven different cores through which the light can travel, instead of one in current state-of-the-art fibers. This compares to going from a one-way road to a seven-lane highway. Also, they introduce two additional orthogonal dimensions for data transportation – as if three cars can drive on top of each other in the same lane. Combining those two methods, they achieve a gross transmission throughput of 255 Terabits/s over the fiber link. This is more than 20 times the current standard of 4-8 Terabits/s.
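The quoted numbers decompose as spatial channels times per-channel rate; a back-of-envelope sketch, assuming three modes per core as the article's "three cars per lane" analogy suggests:

```python
cores = 7                # seven-core fiber: the "seven-lane highway"
modes_per_core = 3       # "three cars driving on top of each other"
spatial_channels = cores * modes_per_core  # 21x a single-mode baseline

total_tbps = 255.0
per_channel_tbps = total_tbps / spatial_channels  # ~12.1 Tb/s per channel

standard_tbps = 8.0      # top of the quoted 4-8 Tb/s standard range
speedup = total_tbps / standard_tbps              # ~32x the standard
```

The per-channel figure is itself an aggregate over many wavelengths (WDM), so this is only the spatial-multiplexing layer of the experiment.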


WIRED: A Brief History of Mind-Bending Ideas About Black Holes


The story starts in 1784, when a geologist named John Michell was thinking deeply about Isaac Newton’s theory of gravity. In Newtonian physics, a cannonball can be shot into orbit around the Earth if it surpasses a particular speed, known as the planet’s escape velocity.

This speed depends on the mass and radius of the object you are trying to escape from. Michell’s insight was to imagine a body whose escape velocity was so great that it exceeded the speed of light – 300,000 kilometers per second – first measured in 1676 by the Danish astronomer Ole Romer.

Michell presented his results to other scientists, who speculated that massive “dark stars” might exist in abundance in the sky but be invisible because light can’t escape their surfaces. The French mathematician Pierre-Simon Laplace later made an independent discovery of these “dark stars,” and both luminaries correctly calculated the very small radius – about 3 kilometers – such an object would have if it were as massive as our sun.

After the revolutions of 20th century physics, black holes got much weirder. In 1916, a short while after Einstein published the complex equations underpinning General Relativity (which Einstein himself couldn’t entirely solve), a German astronomer named Karl Schwarzschild showed that a massive object squeezed to a single point would warp space around it so much that even light couldn’t escape. Though the cartoon version of black holes has them sucking everything up like a vacuum cleaner, light would only be unable to escape Schwarzschild’s object if it was inside a particular radius, called the Schwarzschild radius. Beyond this “event horizon,” you could safely leave the vicinity of a black hole.
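The Schwarzschild radius follows from r_s = 2GM/c^2 (numerically the same expression Michell's Newtonian escape-velocity argument yields); a quick check for a solar-mass object:

```python
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8          # speed of light, m/s
M_SUN = 1.989e30     # mass of the sun, kg

def schwarzschild_radius(mass_kg: float) -> float:
    """Event-horizon radius of a non-rotating mass: r_s = 2GM/c^2."""
    return 2 * G * mass_kg / C**2

r_sun = schwarzschild_radius(M_SUN)  # ~2.95e3 m: about 3 km for one solar mass
```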

Neither Schwarzschild nor Einstein believed this object was anything other than a mathematical curiosity. It took a much better understanding of the lives of stars before black holes were taken seriously. You see, a star only works because it preserves a delicate balance between gravity, which is constantly trying to pull its mass inward, and the nuclear furnace in its belly, which exerts pressure outward. At some point a star runs out of fuel and the fusion at its core turns off. Gravity is given the upper hand, causing the star to collapse. For stars like our sun, this collapse is halted when the electrons in the star’s atoms get so close that they generate a quantum mechanical force called electron degeneracy pressure. An object held up by this pressure is called a white dwarf.

In 1930, the Indian physicist Subrahmanyan Chandrasekhar showed that, given enough mass, a star’s gravity could overcome this electron degeneracy pressure, squeezing all its protons and electrons into neutrons. Though a neutron degeneracy pressure could then hold the weight up, forming a neutron star, the physicist Robert Oppenheimer found that an even more massive object could overcome this final outward pressure, allowing gravity to win and crushing everything down to a single point. Scientists slowly accepted that these things were real objects, not just weird mathematical solutions to the equations of General Relativity. In 1967, physicist John Wheeler used the term “black hole” to describe them in a public lecture, a name that has stuck ever since.

Scooped by Dr. Stefan Gruenwald!

Breakthrough in molecular electronics paves the way for DNA-based computer circuits in the future


Molecular electronics, which uses molecules as building blocks for the fabrication of electronic components, was seen as the ultimate solution to the miniaturization challenge. However, to date, no one has actually been able to make complex electrical circuits using molecules. The only known molecules that can be pre-designed to self-assemble into complex miniature circuits, which could in turn be used in computers, are DNA molecules. Nevertheless, so far no one has been able to demonstrate reliably and quantitatively the flow of electrical current through long DNA molecules.

Now, an international group led by Prof. Danny Porath, the Etta and Paul Schankerman Professor in Molecular Biomedicine at the Hebrew University of Jerusalem, reports reproducible and quantitative measurements of electricity flow through long molecules made of four DNA strands, signaling a significant breakthrough towards the development of DNA-based electrical circuits. The research, which could re-ignite interest in the use of DNA-based wires and devices in the development of programmable circuits, appears in the prestigious journal Nature Nanotechnology under the title "Long-range charge transport in single G-quadruplex DNA molecules."

Prof. Porath is affiliated with the Hebrew University's Institute of Chemistry and its Center for Nanoscience and Nanotechnology. The molecules were produced by the group of Alexander Kotlyar from Tel Aviv University, who has been collaborating with Porath for 15 years. The measurements were performed mainly by Gideon Livshits, a PhD student in the Porath group, who carried the project forward with great creativity, initiative and determination. The research was carried out in collaboration with groups from Denmark, Spain, US, Italy and Cyprus.

According to Prof. Porath, "This research paves the way for implementing DNA-based programmable circuits for molecular electronics, which could lead to a new generation of computer circuits that can be more sophisticated, cheaper and simpler to make."

Scooped by Dr. Stefan Gruenwald!

IBM Watson uses cognitive technologies to help find new sources of oil


Scientists at IBM and Repsol SA, Spain’s largest energy company, announced today (Oct. 30) the world’s first research collaboration using cognitive technologies like IBM’s Watson to jointly develop and apply new tools to make it cheaper and easier to find new oil fields.

An engineer will typically have to manually read through an enormous set of journal papers and baseline reports with models of reservoir, well, facilities, production, export, and seismic imaging data.

IBM says its cognitive technologies could help by analyzing hundreds of thousands of papers, prioritizing data, and linking that data to the specific decision at hand. It will introduce “new real-time factors to be considered, such as current news events around economic instability, political unrest, and natural disasters.”

The oil and gas industry boasts some of the most advanced geological, geophysical and chemical science in the world. But the challenge is to integrate critical geopolitical, economic, and other global news into decisions. And that will require a whole new approach to computing that can speed access to business insights, enhance strategic decision-making, and drive productivity, IBM says.

This goes beyond the capabilities of Watson. But scientists at IBM’s Cognitive Environments Laboratory (CEL), collaborating with Repsol, plan to develop and apply new prototype cognitive tools for real-world use cases in the oil and gas industry. They will experiment with a combination of traditional and new interfaces based upon spoken dialog, gesture, robotics and advanced visualization and navigation techniques.

The objective is to build conceptual and geological models, highlight the impact of the potential risks and uncertainty, visualize trade-offs, and explore what-if scenarios to ensure the best decision is made, IBM says.

Repsol is making an initial investment of $15 million to $20 million to develop two applications targeted for next year, Repsol’s director for exploration and production technology Santiago Quesada explained to Bloomberg Businessweek. “One app will be used for oil exploration and the other to help determine the most attractive oil and gas assets to buy.”

Scooped by Dr. Stefan Gruenwald!

Giant tortoises regain foothold on a Galapagos island - from 15 to over 1,000 breeding on their own again


An endangered population of giant tortoises has recovered on the Galapagos island of Espanola.

Some 40 years after the first captive-bred tortoises were reintroduced to the island by the Galapagos National Park Service, the endemic Española giant tortoises are reproducing and restoring some of the ecological damage caused by feral goats that were brought to the island in the late 19th century.

"The global population was down to just 15 tortoises by the 1960s. Now there are some 1,000 tortoises breeding on their own. The population is secure. It's a rare example of how biologists and managers can collaborate to recover a species from the brink of extinction," said James P. Gibbs, a professor of vertebrate conservation biology at the SUNY College of Environmental Science and Forestry (ESF) and lead author of the paper published in the journal PLOS ONE.

Gibbs and his collaborators assessed the tortoise population using 40 years of data from tortoises marked and recaptured repeatedly for measurement and monitoring by members of the Galapagos National Park Service, Charles Darwin Foundation, and visiting scientists.

But there is another side to the success story: while the tortoise population is stable, it is not likely to increase until more of the landscape recovers from the damage inflicted by the now-eradicated goats. After the goats devoured all the grassy vegetation and were subsequently removed from the island, more shrubs and small trees have grown on Española. This hinders both the growth of cactus, which is a vital piece of a tortoise's diet, and the tortoises' movement. Chemical analysis of the soil, done by Dr. Mark Teece, an ESF chemistry professor, shows there has been a pronounced shift from grasses to woody plants on the island in the last 100 years.


  1. James P. Gibbs, Elizabeth A. Hunter, Kevin T. Shoemaker, Washington H. Tapia, Linda J. Cayot. Demographic Outcomes and Ecosystem Implications of Giant Tortoise Reintroduction to Española Island, Galapagos. PLoS ONE, 2014; 9(10): e110742. DOI: 10.1371/journal.pone.0110742

Scooped by Dr. Stefan Gruenwald!

Electron wave function is split by tunnelling into different regions


New research by physicists from Brown University puts the profound strangeness of quantum mechanics in a nutshell—or, more accurately, in a helium bubble.

Experiments led by Humphrey Maris, professor of physics at Brown, suggest that the quantum state of an electron—the electron's wave function—can be shattered into pieces and those pieces can be trapped in tiny bubbles of liquid helium. To be clear, the researchers are not saying that the electron can be broken apart. Electrons are elementary particles, indivisible and unbreakable. But what the researchers are saying is in some ways more bizarre.

In quantum mechanics, particles do not have a distinct position in space. Instead, they exist as a wave function, a probability distribution that includes all the possible locations where a particle might be found. Maris and his colleagues are suggesting that parts of that distribution can be separated and cordoned off from each other.

"We are trapping the chance of finding the electron, not pieces of the electron," Maris said. "It's a little like a lottery. When lottery tickets are sold, everyone who buys a ticket gets a piece of paper. So all these people are holding a chance and you can consider that the chances are spread all over the place. But there is only one prize—one electron—and where that prize will go is determined later."

If Maris's interpretation of his experimental findings is correct, it raises profound questions about the measurement process in quantum mechanics. In the traditional formulation of quantum mechanics, when a particle is measured—meaning it is found to be in one particular location—the wave function is said to collapse.

"The experiments we have performed indicate that the mere interaction of an electron with some larger physical system, such as a bath of liquid helium, does not constitute a measurement," Maris said. "The question then is: What does?"

Scooped by Dr. Stefan Gruenwald!

Isotope effect produces new type of chemical bond - the vibrational muonium bond


Researchers believe they have confirmed the existence of a new type of chemical bond, first proposed some 30 years ago but never convincingly demonstrated because of the lack of experimental evidence and the relatively poor accuracy of the quantum chemistry methods that prevailed at the time. The new work also shows how substituting isotopes can result in fundamental changes in the nature of chemical bonding.

In the early 1980s it was proposed that in certain transition states consisting of a very light atom sandwiched between two heavy ones, the system would be stabilised not by conventional van der Waals forces, but by vibrational bonding, with the light atom shuttling between its two neighbours. However, despite several groups searching for such a system, none was demonstrated and the hunt fizzled out.

Now, Jörn Manz, of the Free University of Berlin and Shanxi University in China, and colleagues believe they have the theoretical and experimental evidence to demonstrate a stable vibrational bond.

The researchers carried out a series of theoretical experiments looking at the reaction of BrH with Br to create the radical BrHBr, but using different isotopes of hydrogen. By using muons – elementary particles that are similar to an electron but have greater mass – the team added a range of hydrogen isotopes to BrHBr, from the relatively hefty muonic helium, 4.1H, to the extremely light muonium, Mu, with a mass nearly 40 times smaller than 4.1H.

The team mapped two key parameters: the potential energy surface of the system – the three-dimensional potential energy ‘landscape’, with its hills and valleys, relating the energy of the system to its geometry – and a quantum mechanical parameter, the vibrational zero-point energy, or ZPE.

Classically, a bond will form if there is a net reduction in the potential energy of the system. However, in certain circumstances, if there is a sufficiently large decrease in the vibrational ZPE, this can overcome the need for a decrease in potential energy and the system can be stabilised by a vibrational bond.
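
The mass dependence at the heart of this effect can be seen in a one-dimensional harmonic-oscillator caricature: ZPE = ħω/2 with ω = √(k/m), so the zero-point energy grows as 1/√m. The sketch below (the force constant k is an arbitrary placeholder, not a value from the study) shows why swapping ordinary hydrogen for the far lighter muonium changes the ZPE so dramatically:

```python
import math

hbar = 1.0546e-34    # reduced Planck constant, J*s
amu = 1.6605e-27     # atomic mass unit, kg
k = 200.0            # force constant, N/m -- an arbitrary placeholder

def harmonic_zpe(mass_amu):
    """Zero-point energy (J) of a 1-D harmonic oscillator of given mass."""
    omega = math.sqrt(k / (mass_amu * amu))   # angular frequency, rad/s
    return 0.5 * hbar * omega

# Muonium (~0.114 u) vs. ordinary hydrogen (~1.008 u):
ratio = harmonic_zpe(0.114) / harmonic_zpe(1.008)
print(ratio)   # ~3: the light isotope's ZPE is about three times larger
```

Because the ratio depends only on the two masses, the conclusion survives the arbitrary choice of k: a sufficiently large drop in this zero-point energy at the BrHBr geometry can pay for an increase in potential energy, stabilising a vibrational bond.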

Scooped by Dr. Stefan Gruenwald!

New solar power material converts 90 percent of captured light into heat


A multidisciplinary engineering team at the University of California, San Diego developed a new nanoparticle-based material for concentrating solar power plants designed to absorb and convert to heat more than 90 percent of the sunlight it captures. The new material can also withstand temperatures greater than 700 degrees Celsius and survive many years outdoors in spite of exposure to air and humidity. Their work, funded by the U.S. Department of Energy's SunShot program, was published recently in two separate articles in the journal Nano Energy.

By contrast, current solar absorber materials function at lower temperatures and need to be overhauled almost every year for high-temperature operation. "We wanted to create a material that absorbs sunlight that doesn't let any of it escape. We want the black hole of sunlight," said Sungho Jin, a professor in the department of Mechanical and Aerospace Engineering at UC San Diego Jacobs School of Engineering. Jin, along with professor Zhaowei Liu of the department of Electrical and Computer Engineering, and Mechanical Engineering professor Renkun Chen, developed the silicon boride-coated nanoshell material. They are all experts in functional materials engineering.

The novel material features a "multiscale" surface created by using particles of many sizes ranging from 10 nanometers to 10 micrometers. The multiscale structures can trap and absorb light which contributes to the material's high efficiency when operated at higher temperatures.

Concentrating solar power (CSP) is an emerging alternative clean energy market that produces approximately 3.5 gigawatts worth of power at power plants around the globe—enough to power more than 2 million homes, with additional construction in progress to provide as much as 20 gigawatts of power in coming years. One of the technology's attractions is that it can be used to retrofit existing power plants that use coal or fossil fuels because it uses the same process to generate electricity from steam.
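
Those capacity figures are easy to sanity-check (a back-of-the-envelope sketch added here; the average household draw is an assumed figure for illustration, not a number from the article):

```python
# 3.5 GW spread across ~2 million homes implies the average continuous
# power available per home:
capacity_w = 3.5e9
homes = 2.0e6
per_home_w = capacity_w / homes
print(per_home_w)   # 1750.0 W per home

# That sits comfortably above a typical ~1.2 kW average household draw
# (an assumed figure), so "more than 2 million homes" is plausible.
assumed_household_draw_w = 1200.0
print(per_home_w > assumed_household_draw_w)   # True
```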

Scooped by Dr. Stefan Gruenwald!

Google X project plans to use magnetic nanoparticles and wearable sensor to detect diseases


Google announced a new “Nanoparticle Platform” project Tuesday to develop medical diagnostic technology using nanoparticles, Andrew Conrad, head of the Google X Life Sciences team, disclosed at The Wall Street Journal’s WSJD Live conference. The idea is to use nanoparticles with magnetic cores circulating in the bloodstream with recognition molecules to detect cancer, plaques, or too much sodium, for example.

There are a number of similar research projects using magnetic (and other) nanoparticles in progress, as reported on KurzweilAI. What’s new in the Google project is delivering nanoparticles to the bloodstream via a pill and using a wearable wrist detector to detect the nanoparticles’ magnetic field and read out diagnostic results.

But this is an ambitious moonshot project. “Google is at least five to seven years away from a product approved for use by doctors,” said Sam Gambhir, chairman of radiology at Stanford University Medical School, who has been advising Dr. Conrad on the project for more than a year, the WSJ reports.

“Even if Google can make the system work, it wouldn’t immediately be clear how to interpret the results. That is why Dr. Conrad’s team started the Baseline study [see “New Google X Project to look for disease and health patterns in collected data”], which he hopes will create a benchmark for comparisons.”

Scooped by Dr. Stefan Gruenwald!

Lights out: study shows urgent need to address instability of world's power grid


Research by Hugh Byrd, Professor of Architecture at the University of Lincoln, UK, and Steve Matthewman, Associate Professor of Sociology at the University of Auckland, New Zealand, highlights the insecurities of power systems and weakening electrical infrastructure across the globe, particularly in built-up urban areas.

The work builds on previous studies which examined a sharp increase in electrical usage over recent years, and warned the world to prepare for the prospect of coping without electricity as instances of complete power failure become increasingly common.

Professor Byrd explained: “We have previously highlighted that demand for new technology continues to grow at an unprecedented rate. Our new research emphasizes why energy sources are becoming increasingly inadequate, and simply cannot continue to meet this demand.

“Throughout our study, we observed a number of network failures due to inadequate energy, whether through depletion of resources such as oil and coal, or through the vagaries of the climate in the creation of renewable energy.”

The British energy regulator Ofgem has predicted a fall in spare electrical power production capacity to two per cent by 2015, meaning there is now even less flexibility of supply to adjust to spikes in demand. 

The issue of energy security exists for countries which have access to significant renewable power supplies too. With rain, wind and sunshine becoming less predictable due to changes brought about by global warming, the new research found that severe blackouts in Kenya, India, Tanzania and Venezuela, which all occurred during the last decade, were caused by shortages of rain in hydro-dams.

Further to the irregularities involved in renewable power generation, the study concludes that worldwide electricity supply will also become increasingly precarious due to industry privatization and neglect of infrastructure.

Professor Matthewman said: “Over the past two decades, deregulation and privatization have become major global trends within the electrical power industry. In a competitive environment, reliability and profits may be at cross-purposes — single corporations can put their own interests ahead of the shared grid, and spare capacity is reduced in the name of cost saving. There is broad consensus among energy specialists, national advisory bodies, the reinsurance industry, and organizational sociologists that this has exacerbated blackout risk.”

These trends have seen the separation of power generation, transmission and distribution services – a process which Professors Byrd and Matthewman suggest only opens up more opportunity for electrical disruption. Their study reveals the difficulties that arise when different technical and human systems need to communicate, and points to a breakdown in this type of communication as the main cause behind two of the world’s worst-ever blackouts – one spreading from Ohio, USA, to Ontario, Canada, in 2003, and another across Italy and neighboring nations in the same year. Together, these power failures affected more than 100 million people.

Scooped by Dr. Stefan Gruenwald!

San Diego company develops 10-minute $10 Ebola test


With a single prick and a single drop of blood, a San Diego company claims it can now detect whether a patient has Ebola in less than 10 minutes. The breakthrough technology is called “Ebola Plus,” a tool that can be used to detect Ebola in anyone, anywhere in the world.

“We can do that for a large number of tests simultaneously with just one drop of blood,” said Dr. Cary Gunn, Ph.D., CEO of Genalyte. Once blood is drawn, a silicon chip is used to detect the virus as blood flows over it.

Researchers at Genalyte have been working on the diagnostic tool for seven years, using it to test for various diseases, and only recently discovered it could also work to spot Ebola. “It allows you to screen more patients more rapidly. The biggest question right now is the debate about quarantine.

Instead of asking people to take their temperature once or twice a day, they can just take a prick of blood,” said Dr. Gunn. It can analyze up to 100 samples per hour, and be administered anywhere, including hospitals, airports, and even remote areas in West Africa where the disease is spreading rapidly. “Right now, most people in Liberia aren’t even being tested. People who have suspicion of having Ebola are being checked into wards. The ability to take a prick of blood and do the test would be a game changer in that environment,” said Gunn.

Developing the platform for the test cost Genalyte around $100,000, but each chip used in the tests costs $10 – making early detection cheaper and easier for caretakers. Currently, the FDA has only approved PCR tests, which can take two hours to deliver results, compared to the Ebola Plus, which can provide results in ten minutes.

Scooped by Dr. Stefan Gruenwald!

Rapid Evolution of Anole Populations in Real Time


On islands off the coast of Florida, scientists uncover swift adaptive changes among Carolina anole populations, whose habitats were disturbed by the introduction of another lizard species.

For most of its existence, the Carolina anole (Anolis carolinensis) was the only lizard in the southeastern U.S. It could perch where it wanted, eat what it liked. But in the 1970s, aided by the human pet trade, the brown anole (Anolis sagrei)—native to Cuba and the Bahamas—came marching in. In experiments on islands off the coast of Florida, scientists studying the effects of the species mixing witnessed evolution in action: the Carolina anole started perching higher up in trees, and its toe pads changed to enable better grip—all in a matter of 15 years, or about 20 lizard generations.

In a paper published in Science today (October 23), Yoel Stuart of the University of Texas at Austin, Todd Campbell from the University of Tampa, Florida, and their colleagues discuss what happened when the two species converged upon the same habitats.

Scooped by Dr. Stefan Gruenwald!

Fact or Fiction?: Mammoths Can Be Brought Back from Extinction


In a petri dish in the bowels of Harvard Medical School, scientists have tweaked three genes from the cells of an Asian elephant that help control the production of hemoglobin, the protein in blood that carries oxygen. Their goal is to make these genes more like those of an animal that last walked the planet thousands of years ago: the woolly mammoth.

"Asian elephants are closer to mammoths than either is to African elephants, yet quite different in appearance and temperature range," notes Harvard geneticist and technology developer George Church. "We are not trying to make an exact copy of a mammoth, but rather a cold-resistant elephant."
But what if the new—and fast advancing—techniques of genome editing allowed scientists to engineer not only cold-resistance traits but also other characteristics of the woolly mammoth into its living Asiatic relatives? Scientists have found mammoth cells preserved in permafrost. If they were to recover cells with intact DNA, they could theoretically “edit” an Asian elephant’s genome to match the woolly mammoth’s. A single cell contains the complete genetic instruction set for its species, and by replicating that via editing, a new individual can, theoretically, be created. But would such a hybrid—scion of an Asian elephant mother and genetic tinkerers—count as a true woolly mammoth?

In other words, is de-extinction a real possibility?

The answer is yes. On January 6, 2000, a falling tree killed the last bucardo, a wild Iberian ibex, which is a goatlike animal. Her name was Celia. On July 30, 2003, Celia's clone was born. To make the clone, scientists removed the nucleus of a cell from Celia intact and inserted it into the unfertilized egg cell of another kind of ibex. They then transferred the resulting embryo to the womb of a living goat. Nearly a year later they delivered the clone by cutting her from her mother.

Although she lived for a scant seven minutes due to lung defects, Celia's clone proved that not only is de-extinction real, "it has already happened," in the words of environmentalist Stewart Brand, whose San Francisco-based Long Now Foundation is funding some of this de-extinction research, including Church's effort as well as bids to bring back the passenger pigeon and heath hen, among other candidate species. Nor is the bucardo alone in the annals of de-extinction. Several viruses have already been brought back, including the flu variant responsible for the 1918 pandemic that killed more than 20 million people worldwide.

Scooped by Dr. Stefan Gruenwald!

New compounds decrease inflammation associated with ulcerative colitis, arthritis and multiple sclerosis


Six Case Western Reserve scientists are part of an international team that has discovered two compounds that show promise in decreasing inflammation associated with diseases such as ulcerative colitis, arthritis and multiple sclerosis. The compounds, dubbed OD36 and OD38, specifically appear to curtail inflammation-triggering signals from RIPK2 (receptor-interacting protein kinase 2), a serine/threonine/tyrosine kinase. RIPK2 is an enzyme that activates high-energy molecules to prompt the immune system to respond with inflammation. The findings of this research appear in the Journal of Biological Chemistry.

“This is the first published indication that blocking RIPK2 might be efficacious in inflammatory disease,” said senior author Derek Abbott, MD, PhD, associate professor of pathology, Case Western Reserve University School of Medicine. “Our data provides a strong rationale for further development and optimization of RIPK2-targeted pharmaceuticals and diagnostics.”

In addition to Abbott and his medical school colleagues, the research team included representatives of Oncodesign, a therapeutic molecule biotechnology company in Dijon, France; Janssen Research & Development, a New Jersey-based pharmaceutical company; and Asclepia Outsourcing Solutions, a Belgium-based medicinal chemistry company.

The normal function of RIPK2 is to send warning signals to cells that bacterial infection has occurred, which in turn spurs the body to mobilize white blood cells. The white blood cells identify and encircle pathogens, which cause blood to accumulate in the region. It is this blood build-up that leads to the red and swollen areas characteristic of inflammation. When this process goes awry, the inflammation increases dramatically and tissue destruction ensues. RIPK2 works in conjunction with NOD1 and NOD2 (nucleotide-binding oligomerization domain) proteins in controlling responses by the immune system that lead to this inflammation process.

In this research project, investigators applied state-of-the-art genetic sequencing to learn the unique set of genes driven specifically by NOD2 proteins. They ultimately zeroed in on three specific NOD2-driven inflammation genes (SLC26a, MARCKSL1, and RASGRP1) that guided investigators in finding the most effective compounds.

Oncodesign searched its library of 4,000 compounds that targeted kinases, and after exhaustive study, narrowed the selection down to 13. Then investigators tested the 13 compounds in mouse and human cells and found that two compounds, OD36 and OD38, were most effective in blocking RIPK2. 

“Based on the design of OD36 and OD38, we have developed with Oncodesign fifth-generation compounds that are even more effective than the first-generation OD36 and OD38,” Abbott said. “Our next step is to seek a larger pharmaceutical company that can move these compounds forward into Phase 1 clinical trials in humans.”

Scooped by Dr. Stefan Gruenwald!

Faster switching helps ferroelectrics become viable replacement for transistors


Ferroelectric materials – commonly used in transit cards, gas grill igniters, video game memory and more – could become strong candidates for use in next-generation computers, thanks to new research led by scientists at the University of California, Berkeley, and the University of Pennsylvania.

The researchers found an easy way to improve the performance of ferroelectric materials in a way that makes them viable candidates for low-power computing and electronics. They described their work in a study published today (Sunday, Oct. 26) in the journal Nature Materials.

Ferroelectric materials have spontaneous polarization as a result of small shifts of negative and positive charges within the material. A key characteristic of these materials is that the polarization can be reversed in response to an electric field, enabling the creation of a “0” or “1” data bit for memory applications. Ferroelectrics can also produce an electric charge in response to physical force, such as being pressed, squeezed or stretched, which is why they are found in applications such as push-button igniters on portable gas grills.
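
A standard way to picture this bistability (a toy Landau-style sketch with made-up coefficients, not the authors' model) is a double-well free energy F(P) = -aP² + bP⁴ - EP: at zero field the two wells at ±P_s store a “0” or a “1”, and a strong enough field tilts the landscape so only one state survives:

```python
def stable_polarization(E, a=1.0, b=1.0):
    """Grid-search the polarization P minimizing the toy free energy
    F(P) = -a*P**2 + b*P**4 - E*P (arbitrary units)."""
    grid = [i * 0.001 - 2.0 for i in range(4001)]   # P from -2 to 2
    return min(grid, key=lambda P: -a * P**2 + b * P**4 - E * P)

# Zero field: two degenerate wells at roughly +/-0.707 -- the bit simply
# keeps whichever state it is in (the scan reports one of the two wells).
print(abs(stable_polarization(0.0)))   # ~0.707

# A field pushes the system into one definite well: writing a "1" or a "0".
print(stable_polarization(+1.0) > 0)   # True  -> "1"
print(stable_polarization(-1.0) < 0)   # True  -> "0"
```

The piezoelectric response mentioned above is the other face of the same picture: squeezing the material shifts the well positions, moving charge even without an applied field.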

“What we discovered was a fundamentally new and unexpected way for these ferroelectric materials to respond to applied electric fields,” said study principal investigator Lane Martin, UC Berkeley associate professor of materials science and engineering. “Our discovery opens up the possibility for faster switching and new control over novel, never-before-expected multi-state devices.”

Martin and other UC Berkeley researchers partnered with a team led by Andrew Rappe, University of Pennsylvania professor of chemistry and of materials science and engineering. UC Berkeley graduate student Ruijuan Xu led the study’s experimental design, and Penn graduate student Shi Liu led the study’s theoretical modeling.

Scientists have turned to ferroelectrics as an alternative form of data storage and memory because the material holds a number of advantages over conventional semiconductors. For example, anyone who has ever lost unsaved computer data after power is unexpectedly interrupted knows that today’s transistors need electricity to maintain their “on” or “off” state in an electronic circuit.

Because ferroelectrics are non-volatile, they can remain in one polarized state or another without power. This ability of ferroelectric materials to store memory without continuous power makes them useful for transit cards, such as the Clipper cards used to pay fare in the Bay Area, and in certain memory cards for consumer electronics. If used in next-generation computers, ferroelectrics would enable the retention of information so that data would be there if electricity goes out and then is restored.

“If we could integrate these materials into the next generation of computers, people wouldn’t lose their data if the power goes off,” said Martin, who is also a faculty scientist at the Lawrence Berkeley National Laboratory. “For an individual, losing unsaved work is an inconvenience, but for large companies like eBay, Google and Amazon, losing data is a significant loss of revenue.”

So what has held ferroelectrics back from wider use as on/off switches in integrated circuits? The answer is speed, according to the study authors.
