The Internet’s Population Doubled Over the Last Five Years. Royal Pingdom susses out some interesting trends about the world’s 2.27 billion Internet users.
NOTE: To subscribe to the RSS feed of Amazing Science, copy http://www.scoop.it/t/amazing-science/rss.xml into the URL field of your browser and click "subscribe".
This newsletter is aggregated from over 1,450 news sources.
NOTE: All articles in the amazing-science newsletter can also be sorted by topic. To do so, click the FIND button (symbolized by the FUNNEL at the top right of the screen) to display all the relevant postings SORTED by TOPICS.
You can also type your own query: e.g., if you are looking for articles involving "dna" as a keyword.
MOST_READ • 3D_printing • aging • AI • anthropology • art • astronomy • bigdata • bioinformatics • biology • biotech • chemistry • computers • cosmology • education • environment • evolution • future • genetics • genomics • geosciences • green_energy • history • language • map • material_science • math • med • medicine • microscopy • nanotech • neuroscience • paleontology • photography • photonics • physics • postings • robotics • science • technology • video
From recognizing speech to identifying unusual stars, new discoveries often begin with comparison of data streams to find connections and spot outliers. But simply feeding raw data into a data-analysis algorithm is unlikely to produce meaningful results, say the authors of a new Cornell study. That’s because most data comparison algorithms today have one major weakness: somewhere, they rely on a human expert to specify what aspects of the data are relevant for comparison, and what aspects aren’t.
But these experts can’t keep up with the growing amounts and complexities of big data. So the Cornell computing researchers have come up with a new principle they call “data smashing” for estimating the similarities between streams of arbitrary data without human intervention, and even without access to the data sources.
Data smashing is based on a new way to compare data streams. The process involves two steps: first, the raw data are quantized into streams of symbols; then the algorithm "collides" one stream with the inverse, or "anti-stream", of the other and measures how much statistical structure survives the collision. Streams from similar sources annihilate almost completely, so the leftover information serves as a measure of their dissimilarity.
Data-smashing principles could open the door to understanding increasingly complex observations, especially when experts don’t know what to look for, according to the researchers. The researchers— Hod Lipson, associate professor of mechanical engineering and of computing and information science, and Ishanu Chattopadhyay, a former postdoctoral associate with Lipson now at the University of Chicago — demonstrated this idea with data from real-world problems, including detection of anomalous cardiac activity from heart recordings and classification of astronomical objects from raw photometry.
In all cases, and without access to original domain knowledge, the researchers demonstrated that the performance of these general algorithms was on par with the accuracy of specialized algorithms and heuristics hand-tuned by experts.
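The authors' actual anti-stream construction is more sophisticated, but the spirit of model-free stream comparison can be illustrated with a toy sketch. In the Python below, everything is invented for the example: two synthetic symbol "sources" are compared purely through their empirical transition statistics, with no domain knowledge about what the symbols mean.

```python
import random

def transition_counts(stream, alphabet_size):
    """Empirical symbol-transition probabilities of a quantized stream."""
    probs = [[0.0] * alphabet_size for _ in range(alphabet_size)]
    for a, b in zip(stream, stream[1:]):
        probs[a][b] += 1
    for row in probs:
        total = sum(row)
        if total:
            for j in range(alphabet_size):
                row[j] /= total
    return probs

def stream_distance(s1, s2, alphabet_size=2):
    """Crude model-free dissimilarity: L1 gap between transition statistics."""
    t1 = transition_counts(s1, alphabet_size)
    t2 = transition_counts(s2, alphabet_size)
    return sum(abs(t1[i][j] - t2[i][j])
               for i in range(alphabet_size) for j in range(alphabet_size))

random.seed(0)
# Two kinds of synthetic source: a fair coin, and a "sticky" process
# that tends to repeat its previous symbol
fair = [random.randint(0, 1) for _ in range(5000)]
fair2 = [random.randint(0, 1) for _ in range(5000)]
sticky = [0]
for _ in range(4999):
    sticky.append(sticky[-1] if random.random() < 0.9 else 1 - sticky[-1])

print(stream_distance(fair, fair2))   # small: same underlying dynamics
print(stream_distance(fair, sticky))  # large: different dynamics
```

Even this crude statistic separates the two sources without being told anything about coins or stickiness, which is the intuition behind comparing arbitrary data streams without expert-chosen features.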
For almost 400 years, mercury gauges have prevailed as the most accurate way to measure pressure. Now, within weeks of seeing "first light," a novel pressure-sensing device has surpassed the performance of the best mercury-based techniques in resolution, speed, and range at a fraction of the size. The new instrument, called a fixed-length optical cavity (FLOC), works by detecting subtle changes in the wavelength of light passing through a cavity filled with nitrogen gas.
The FLOC system is poised to depose traditional mercury pressure sensors – also called manometers – as the standard used to calibrate commercial equipment, says the interdisciplinary team of NIST researchers who developed the system and will continue to refine it over the next few years. The new design is also a promising candidate for a factory-floor pressure instrument that could be used by a range of industries, including those associated with semiconductor, glass, and aerospace manufacturing.
"We've exceeded the expectations we had three years ago," says Thermodynamic Metrology Group Leader Gregory Strouse. "This device is not only a photonic sensor, it's also a primary standard. It's the first photonic-based primary pressure standard. And it works."
About the size of a travel mug, the FLOC has a resolution of 0.1 mPa (millipascals, or thousandths of a pascal), 36 times better than NIST’s official U.S. pressure standard, which is a 3-meter-tall (about 10-foot) column of liquid mercury that extends through the ceiling of the calibration room.
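As a rough illustration of the operating principle (not NIST's actual calibration chain), the pressure of the nitrogen in the cavity can be estimated from its refractive index, since a gas's refractivity scales with its number density. The refractivity constant below is an approximate textbook value for nitrogen, and the function is a sketch assuming ideal-gas behavior, not the instrument's real model:

```python
# Illustrative only: the real FLOC tracks shifts in a cavity resonance;
# here we simply invert the (approximately linear) relation between
# nitrogen's refractivity and its pressure at a known temperature.

N2_REFRACTIVITY_STP = 2.98e-4   # n - 1 for nitrogen near 101325 Pa, 273.15 K (approx.)
P_STP = 101325.0                # Pa
T_STP = 273.15                  # K

def pressure_from_index(n, temperature_k):
    """Estimate nitrogen pressure (Pa) from its measured refractive index."""
    # (n - 1) scales with number density, i.e. with P / T for an ideal gas
    return (n - 1.0) / N2_REFRACTIVITY_STP * P_STP * (temperature_k / T_STP)

# A measured index of 1.000149 at 293 K corresponds to roughly half an atmosphere
print(pressure_from_index(1.000149, 293.0))
```

The point of the sketch is why wavelength measurements can stand in for pressure: a tiny, well-characterized change in refractive index maps directly onto a pressure reading.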
It’s not such a stretch to think that humans can catch the Ebola virus from monkeys and the flu virus from pigs. After all, they are all mammals with fundamentally similar physiologies. But now researchers have discovered that even a virus found in the lowly algae can make mammals its home. The invader doesn’t make people or mice sick, but it does seem to slow specific brain activities.
The virus, called ATCV-1, showed up in human brain tissue several years ago, but at the time researchers could not be sure whether it had entered the tissue before or after the people died. Then, it showed up again in a survey of microbes and viruses in the throats of people with psychiatric disease. Pediatric infectious disease expert Robert Yolken from Johns Hopkins University School of Medicine in Baltimore, Maryland, and his colleagues were trying to see if pathogens play a role in these conditions. At first, they didn't know what ATCV-1 was, but a database search revealed its identity as a virus that typically infects a species of green algae found in lakes and rivers.
The researchers wanted to find out if the virus was in healthy people as well as sick people. They checked for it in 92 healthy people participating in a study of cognitive function and found it in 43% of them. What’s more, those infected with the virus performed 10% worse than uninfected people on tests requiring visual processing. They were slower in drawing a line connecting a sequence of numbers randomly placed on a page, for example. And they seemed to have shorter attention spans, the researchers report online today in the Proceedings of the National Academy of Sciences. The effects were modest, but significant.
The slower brain function was not associated with any differences in sex, income or education level, race, place of birth, or cigarette smoking. But that doesn't necessarily mean the virus causes cognitive decline; it might just benefit from some other factor that impairs the brain in some people, such as other infectious agents, heavy metals, or pollutants, the researchers say.
To test for causality, the team injected uninfected and infected green algae into the mouths of mice. (They could tell that the mice became infected with the virus because they developed antibodies to it.) Infected and uninfected mice then underwent a battery of tests. The two groups were about on par in how well they moved, but infected animals took 10% longer to find their way out of mazes and spent 20% less time exploring new objects—indications that they had poorer attention spans and were not as good at remembering their surroundings.
Archaeologists are known for getting their hands—and everything else—dirty. However, in April 2012, archaeologist Damian Evans boarded a helicopter and spent hours being flown over the dense foliage surrounding Angkor Wat, Cambodia’s legendary complex of ancient stone temples. The heat inside the helicopter was intense, reaching upward of 40 °C, but Evans, a faculty member of the University of Sydney based in Cambodia, much preferred flying over the trees to trudging through the vegetation beneath them. The visible, known temples in Angkor were well-trodden and populated with visitors from all over the world. However, the outlying forest beneath Evans’s ride, lush and green from the air, hid land mines left over from Cambodia’s tumultuous past.
In recent years, archaeologists have used the airborne laser-scanning technology carried on flights like Evans’s, called LiDAR (a portmanteau of “light” and “radar”), to find ruins of structures—roads, canals, temples, reservoirs—long overgrown with vegetation and hidden from easy observation. LiDAR is revolutionizing what scientists think about the size of ancient cities and how ancient civilizations used the land. It has accelerated the pace of surveying nearly impenetrable areas to a rate that would have been unthinkable just a few years ago.
After 20 hours of flying over two days, the helicopter carrying Evans had surveyed 370 square kilometers (91,400 acres) and Evans had as much data about Angkor’s hidden landforms as he might have gathered during his entire career.
“To achieve the same number of data points as we did with LiDAR would have taken decades on the ground,” Evans says. In addition, Evans says he and his colleagues suspected that previous studies of the area were incomplete. “Our concerns were that previous research had missed three-quarters of the downtown metropolitan part of Angkor.”
The urban sprawl around the temples of Angkor had already been identified as the largest integrated settlement complex of the preindustrial world. However, Evans’s LiDAR map confirmed that existing surveys had been vastly underestimating the size of the formally planned street grid in the central area of the city.
In a 2007 PNAS study (1) that combined ground surveys with airborne radar mapping, Evans and his colleagues first found a chaotic urban sprawl beyond the city walls of the Angkor Wat complex, showing that the temples were the centers of large urban landscapes.
In a study based on LiDAR maps, published in PNAS in July 2013 (2), Evans et al. reported that the lasers revealed in exceptional detail that the now-tangled land outside the temple walls had once, 1,000 years ago, been divided into neat rectangles like city blocks, with canals, ponds, and residences. The urban grids found inside the walls had been built outside, too, covering an area of 36 square kilometers.
“It’s relatively easy to draw a line around temples, but the revelation from LiDAR is that you find this web of subtle traces of urban networks,” Evans says.
The new study suggests the Angkor capitals were much more densely populated than was previously believed, adding evidence to a growing idea that the Khmer civilization grew so large that it could not grow enough crops to keep up. Overpopulation, combined with the lack of sustainable agricultural methods, may have left the cities vulnerable to the decades-long droughts that struck in the 14th and 15th centuries, the same period in which the Khmer kings abandoned Angkor.
Lasers – devices that deliver beams of highly organized light – are so deeply integrated into modern technology that their basic operations would seem well understood. CD players, medical diagnostics and military surveillance all depend on lasers.
"It's as though you are using loss to your advantage," said graduate student Omer Malik, an author of the study along with Li Ge, now an assistant professor at the City University of New York, and Hakan Tureci, assistant professor of electrical engineering at Princeton. The researchers said that restricting the delivery of power causes much of the physical space within a laser to absorb rather than produce light. In exchange, however, the optimally efficient portion of the laser is freed from competition with less efficient portions and shines forth far more brightly than previous estimates had suggested.
The results, based on mathematical calculations and computer simulations, still need to be verified in experiments with actual lasers, but the researchers said they represent a new understanding of the fundamental processes that govern how lasers produce light.
"Distributing gain and loss within the material is a higher level of design – a new tool – that had not been used very systematically until now," Tureci said.
The heart of a laser is a material that emits light when energy is supplied to it. When a low level of energy is added, the light is "incoherent," essentially meaning that it contains a mix of wavelengths (or colors). As more energy is added, the material suddenly reaches a "lasing" threshold when it emits coherent light of a particular wavelength.
The entire surface of the material does not emit laser light; rather, if the material is arranged as a disc, for example, the light might come from a ring close to the edge. As even more energy is added, more patterns emerge – for example a ring closer to the center might reach the laser threshold. These patterns – called modes – begin to interact and sap energy from each other. Because of this competition, subsequent modes requiring higher energy may never reach their lasing thresholds. However, Tureci's research group found that some of these higher threshold modes were potentially far more efficient than the earlier ones if they could just be allowed to function without competition.
The discovery of high-temperature superconductors, the determination of DNA’s double-helix structure, the first observations that the expansion of the Universe is accelerating — all of these breakthroughs won Nobel prizes and international acclaim. Yet none of the papers that announced them comes anywhere close to ranking among the 100 most highly cited papers of all time.
Citations, in which one paper refers to earlier works, are the standard means by which authors acknowledge the source of their methods, ideas and findings, and are often used as a rough measure of a paper’s importance. Fifty years ago, Eugene Garfield published the Science Citation Index (SCI), the first systematic effort to track citations in the scientific literature. To mark the anniversary, Nature asked Thomson Reuters, which now owns the SCI, to list the 100 most highly cited papers of all time. (See the full list at Web of Science Top 100.xls or the interactive graphic, below.) The search covered all of Thomson Reuters’ Web of Science, an online version of the SCI that also includes databases covering the social sciences, arts and humanities, conference proceedings and some books. It lists papers published from 1900 to the present day.
The exercise revealed some surprises, not least that it takes a staggering 12,119 citations to rank in the top 100 — and that many of the world’s most famous papers do not make the cut. A few that do, such as the first observation [1] of carbon nanotubes (number 36), are indeed classic discoveries. But the vast majority describe experimental methods or software that have become essential in their fields.
The most cited work in history, for example, is a 1951 paper [2] describing an assay to determine the amount of protein in a solution. It has now gathered more than 305,000 citations — a recognition that always puzzled its lead author, the late US biochemist Oliver Lowry.
Officials from Guinness World Records have recognized DARPA’s Terahertz Electronics program for creating the fastest solid-state amplifier integrated circuit ever measured: one terahertz (10¹² hertz), or one trillion cycles per second — 150 billion cycles per second faster than the existing world record set in 2012.
“This breakthrough could lead to revolutionary technologies such as high-resolution security imaging systems, improved collision-avoidance radar, communications networks with many times the capacity of current systems, and spectrometers that could detect potentially dangerous chemicals and explosives with much greater sensitivity,” said Dev Palmer, DARPA program manager.
Developed by Northrop Grumman Corporation, the Terahertz Monolithic Integrated Circuit (TMIC) exhibits power gains (amplification) several orders of magnitude beyond the current state of the art by using a super-scaled 25 nanometer gate-length indium phosphide high electron mobility transistor.
The TMIC showed a measured gain (on the logarithmic decibel scale) of nine decibels at 1.0 terahertz and eight decibels at 1.03 terahertz. “Nine decibels of gain is unheard of at terahertz frequencies,” said Palmer. “This opens up new possibilities for building terahertz radio circuits.”
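For readers unused to decibels, the reported figures convert to linear power ratios with the standard formula gain = 10^(dB/10), so nine decibels means the output power is roughly eight times the input power:

```python
import math

def db_to_power_gain(db):
    """Convert a gain in decibels to a linear output/input power ratio."""
    return 10 ** (db / 10)

def power_gain_to_db(ratio):
    """Convert a linear power ratio back to decibels."""
    return 10 * math.log10(ratio)

print(db_to_power_gain(9))   # ~7.9x output power at 1.0 THz
print(db_to_power_gain(8))   # ~6.3x output power at 1.03 THz
```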
By contrast, current smartphone technology operates at one to two gigahertz and wireless networks at 5.7 gigahertz.
For years, researchers have been looking to exploit the high-frequency sub-millimeter-wave spectrum beginning above 300 gigahertz. Current electronics using solid-state technologies have largely been unable to access the sub-millimeter band of the electromagnetic spectrum due to insufficient transistor performance.
To address the “terahertz gap,” engineers have traditionally used frequency conversion—converting alternating current at one frequency to alternating current at another frequency—to multiply circuit operating frequencies up from millimeter-wave frequencies.
This approach, however, restricts the output power of electrical devices and adversely affects signal-to-noise ratio. Frequency conversion also increases device size, weight and power supply requirements.
Recent advances in the targeted modification of complex eukaryotic genomes have unlocked a new era of genome engineering. From the pioneering work using zinc-finger nucleases (ZFNs), to the advent of the versatile and specific TALEN systems, and most recently the highly accessible CRISPR/Cas9 systems, we now possess an unprecedented ability to analyze developmental processes using sophisticated designer genetic tools. Excitingly, these robust and simple genomic engineering tools also promise to revolutionize developmental studies using less well established experimental organisms.
Modern developmental biology was born out of the fruitful marriage between traditional embryology and genetics. Genetic tools, together with advanced microscopy techniques, serve as the most fundamental means for developmental biologists to elucidate the logistics and the molecular control of growth, differentiation and morphogenesis. For this reason, model organisms with sophisticated and comprehensive genetic tools have been highly favored for developmental studies. Advances made in developmental biology using these genetically amenable models have been well recognized. The Nobel Prize in Physiology or Medicine was awarded in 1995 to Edward B. Lewis, Christiane Nüsslein-Volhard and Eric F. Wieschaus for their discoveries concerning the ‘genetic control of early embryonic development’ using Drosophila melanogaster, and again in 2002 to John Sulston, Robert Horvitz and Sydney Brenner for their discoveries concerning the ‘genetic regulation of organ development and programmed cell death’ using the nematode worm Caenorhabditis elegans. These fly and worm systems remain powerful and popular models for invertebrate development studies, while zebrafish (Danio rerio), the dual frog species Xenopus laevis and Xenopus tropicalis, rat (Rattus norvegicus), and particularly mouse (Mus musculus) represent the most commonly used vertebrate model systems. To date, random or semi-random mutagenesis (‘forward genetic’) approaches have been extraordinarily successful at advancing the use of these model organisms in developmental studies. With the advent of reference genomic data, however, sequence-specific genomic engineering tools (‘reverse genetics’) enable targeted manipulation of the genome and thus allow previously untestable hypotheses of gene function to be addressed.
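As a concrete flavor of how "designer" a CRISPR/Cas9 experiment can be, the sketch below scans a DNA sequence for candidate SpCas9 target sites: a 20-nucleotide guide immediately upstream of an NGG protospacer-adjacent motif (PAM). This is a toy, forward-strand-only illustration; the demo sequence is invented, and real guide design also weighs off-target matches, GC content, and much more.

```python
def find_cas9_targets(seq, guide_len=20):
    """Return (guide, pam, position) for each NGG PAM with room for a full guide.

    Toy illustration of SpCas9 targeting rules (forward strand only): the
    guide is the 20 nt immediately 5' of an 'NGG' protospacer-adjacent motif.
    """
    seq = seq.upper()
    targets = []
    for i in range(guide_len, len(seq) - 2):
        pam = seq[i:i + 3]
        if pam[1:] == "GG":  # matches N-G-G
            guide = seq[i - guide_len:i]
            targets.append((guide, pam, i))
    return targets

# Hypothetical 50-nt sequence, not from any real gene
demo = "ATGCATTTGACCGTACGATCGATTACCGGATCGATGCAGGCTAGCTAACC"
for guide, pam, pos in find_cas9_targets(demo):
    print(pos, pam, guide)
```

The simplicity of the targeting rule is a large part of why CRISPR/Cas9 spread so quickly to organisms with little prior genetic tooling: retargeting the nuclease mostly means synthesizing a new 20-nt guide.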
On average, every person carries mutations that inactivate at least one copy of 200 or so genes and both copies of around 20 genes. However, knockout mutations in any particular gene are rare, so very large populations are needed to study their effects. These ‘loss of function’ mutations have long been implicated in certain debilitating diseases, such as cystic fibrosis. Most, however, seem to be harmless — and some are even beneficial to the persons carrying them. “These are people we’re not going to find in a clinic, but they’re still really informative in biology,” says geneticist Daniel MacArthur.
The poster child for human-knockout efforts is a new class of drugs that block a gene known as PCSK9 (see Nature 496, 152–155; 2013). The gene was discovered in the early 2000s in French families with extremely high cholesterol levels. But researchers soon found that people with rare mutations that inactivate one copy of PCSK9 have low cholesterol and rarely develop heart disease. The first PCSK9-blocking drugs should hit pharmacies next year, with manufacturers jostling for a share of a market that could reach US$25 billion in five years.
“I think there are hundreds more stories like PCSK9 out there, maybe even thousands,” in which a drug can mimic an advantageous loss-of-function mutation, says Eric Topol, director of the Scripps Translational Science Institute in La Jolla, California. Mark Gerstein, a bioinformatician at Yale University in New Haven, Connecticut, predicts that human knockouts will be especially useful for identifying drugs that treat diseases of ageing. “You could imagine there’s a gene that is beneficial to you as a 25-year-old, but the thing is not doing a good job for you when you’re 75.”
The lungs of the planet are drying out, threatening to cause Earth to cough up some of its carbon reserves. The Amazon rainforest inhales massive amounts of carbon dioxide from the atmosphere, helping keep the globe’s carbon budget in balance (at least until human emissions started throwing that balance off). But as a new study shows, drier conditions since 2000 have been causing a decrease in lung capacity. And if the Amazon’s breaths become more shallow, it’s possible a feedback loop could set in, further reducing lung capacity and throwing the carbon balance further out of whack.
The study, published in the Proceedings of the National Academy of Sciences on Monday, shows that a decline in precipitation has contributed to less healthy vegetation since 2000. “It’s well-established fact that a large part of Amazon is drying. We’ve been able to link that decline in precipitation to a decline in greenness over the last 10 years,” said Thomas Hilker, lead author of the study and forestry expert at Oregon State University.
Since 2000, rainfall has decreased by up to 25 percent across a vast swath of the southeastern Amazon, according to the new satellite analysis by Hilker. The cause of the decline in rainfall hasn’t been pinpointed, though deforestation and changes in atmospheric circulation are possible culprits.
The decrease mostly affected an area of tropical forest 12 times the size of California, as well as adjacent grasslands and other forest types. The browning of that area, which is in the southern Amazon, accounted for more than half the loss of greenness observed by satellites. While the decrease in greenness is small compared with the overall lushness of the rainforest, the impacts could be outsize.
That’s because the amount of carbon the Amazon stores is staggering. An estimated 120 billion tons of carbon are stashed in its plants and soil. Much of that carbon gets there via the forest flora that suck carbon dioxide out of the atmosphere. Worldwide, “it essentially takes up 25 percent of global carbon cycle that vegetation is responsible for,” Hilker said. “It’s a huge carbon stock.”
Researchers at Eindhoven University of Technology (TU/e) in the Netherlands and the University of Central Florida (CREOL), report in the journal Nature Photonics the successful transmission of a record high 255 Terabits/s over a new type of fiber allowing 21 times more bandwidth than currently available in communication networks. This new type of fiber could be an answer to mitigating the impending optical transmission capacity crunch caused by the increasing bandwidth demand.
Due to the popularity of Internet services and the emerging network of capacity-hungry datacentres, demand for telecommunication bandwidth is expected to continue growing at an exponential rate. To transmit more information through current optical glass fibers, one option is to increase the power of the signals to overcome the losses inherent in the glass from which the fiber is manufactured. However, this produces unwanted photonic nonlinear effects, which limit the amount of information that can be recovered after transmission over the standard fiber.
The team at TU/e and CREOL, led by Dr. Chigo Okonkwo, an assistant professor in the Electro-Optical Communications (ECO) research group at TU/e, and Dr. Rodrigo Amezcua Correa, a research assistant professor in micro-structured fibers at CREOL, demonstrates the potential of a new class of fiber to increase transmission capacity and mitigate the impending 'capacity crunch' in their article that appeared yesterday in the online edition of the journal Nature Photonics.
The new fiber has seven different cores through which the light can travel, instead of the one in current state-of-the-art fibers. This is comparable to going from a one-lane road to a seven-lane highway. The researchers also introduce two additional orthogonal dimensions for data transportation – as if three cars could drive on top of each other in the same lane. Combining these two methods, they achieve a gross transmission throughput of 255 Terabits/s over the fiber link. This is more than 20 times the current standard of 4-8 Terabits/s.
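The arithmetic behind such a throughput figure is simply multiplicative across the multiplexing dimensions. The breakdown below is an illustrative accounting, not the paper's exact configuration: only the seven cores come from the article; the mode count, channel count, and per-channel rate are assumptions chosen to land near the reported totals.

```python
# Illustrative accounting of how multiplexing dimensions multiply capacity.
cores = 7                 # spatial cores in the fiber (from the article)
modes_per_core = 3        # assumed orthogonal spatial modes per core
wavelengths = 50          # assumed WDM wavelength channels
gbits_per_channel = 243   # assumed per-channel line rate, Gbit/s

# 7 cores x 3 modes = 21 parallel spatial channels, matching the
# "21 times more bandwidth" quoted for the new fiber
spatial_channels = cores * modes_per_core
print(spatial_channels)

total_tbits = spatial_channels * wavelengths * gbits_per_channel / 1000
print(f"{total_tbits:.0f} Tbit/s")  # gross throughput under these assumptions
```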
The story starts in 1784, when a geologist named John Michell was thinking deeply about Isaac Newton’s theory of gravity. In Newtonian physics, a cannonball can be shot into orbit around the Earth if it surpasses a particular speed, known as the planet’s escape velocity.
This speed depends on the mass and radius of the object you are trying to escape from. Michell’s insight was to imagine a body whose escape velocity was so great that it exceeded the speed of light – 300,000 kilometers per second – first measured in 1676 by the Danish astronomer Ole Romer.
Michell presented his results to other scientists, who speculated that massive “dark stars” might exist in abundance in the sky but be invisible because light can’t escape their surfaces. The French mathematician Pierre-Simon Laplace later made an independent discovery of these “dark stars,” and both luminaries correctly calculated the very small radius – about 3 kilometers – such an object would have if it were as massive as our sun.
After the revolutions of 20th century physics, black holes got much weirder. In 1916, a short while after Einstein published the complex equations underpinning General Relativity (which Einstein himself couldn’t entirely solve), a German astronomer named Karl Schwarzschild showed that a massive object squeezed to a single point would warp space around it so much that even light couldn’t escape. Though the cartoon version of black holes has them sucking everything up like a vacuum cleaner, light would only be unable to escape Schwarzschild’s object if it was inside a particular radius, called the Schwarzschild radius. Beyond this “event horizon,” you could safely leave the vicinity of a black hole.
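The radius in question follows from a one-line formula. Setting the Newtonian escape velocity equal to the speed of light, as Michell effectively did, happens to give the same expression as Schwarzschild's relativistic result, r = 2GM/c²:

```python
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8         # speed of light, m/s
M_SUN = 1.989e30    # solar mass, kg

def schwarzschild_radius(mass_kg):
    """Radius inside which the escape velocity exceeds the speed of light."""
    return 2 * G * mass_kg / C**2

print(schwarzschild_radius(M_SUN) / 1000)        # ~3 km for one solar mass
print(schwarzschild_radius(10 * M_SUN) / 1000)   # scales linearly with mass
```

That the Newtonian and relativistic answers coincide is a numerical accident, but it explains why Michell and Laplace could get the right size for a "dark star" more than a century before General Relativity.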
Neither Schwarzschild nor Einstein believed this object was anything other than a mathematical curiosity. It took a much better understanding of the lives of stars before black holes were taken seriously. You see, a star only works because it preserves a delicate balance between gravity, which is constantly trying to pull its mass inward, and the nuclear furnace in its belly, which exerts pressure outward. At some point a star runs out of fuel and the fusion at its core turns off. Gravity is given the upper hand, causing the star to collapse. For stars like our sun, this collapse is halted when the electrons in the star’s atoms get so close that they generate a quantum mechanical force called electron degeneracy pressure. An object held up by this pressure is called a white dwarf.
In 1930, the Indian physicist Subrahmanyan Chandrasekhar showed that, given enough mass, a star’s gravity could overcome this electron degeneracy pressure, squeezing all its protons and electrons into neutrons. Though a neutron degeneracy pressure could then hold the weight up, forming a neutron star, the physicist Robert Oppenheimer found that an even more massive object could overcome this final outward pressure, allowing gravity to win and crushing everything down to a single point. Scientists slowly accepted that these things were real objects, not just weird mathematical solutions to the equations of General Relativity. In 1967, physicist John Wheeler used the term “black hole” to describe them in a public lecture, a name that has stuck ever since.
The bizarre behavior of the quantum world — with objects existing in two places simultaneously and light behaving as either waves or particles — could result from interactions between many 'parallel' everyday worlds, a new theory suggests.
“It is a fundamental shift from previous quantum interpretations,” says Howard Wiseman, a theoretical quantum physicist at Griffith University in Brisbane, Australia, who together with his colleagues describes the idea in Physical Review X [1].
Theorists have tried to explain quantum behavior through various mathematical frameworks. One of the older interpretations envisages the classical world as stemming from the existence of many simultaneous quantum ones. But that ‘many worlds’ approach, pioneered by the US theorist Hugh Everett III in the 1950s, relies on the worlds branching out independently from one another, and not interacting at all (see 'Many worlds: See me here, see me there').
By contrast, Wiseman’s team envisages many worlds bumping into one another, calling it the 'many interacting worlds' approach. On its own, each world is ruled by classical Newtonian physics. But together, the interacting motion of these worlds gives rise to phenomena that physicists typically ascribe to the quantum world.
The authors work through the mathematics of how that interaction could produce quantum phenomena. For instance, one well-known example of quantum behaviour is when particles are able to tunnel through an energetic barrier that in a classical world they would not be able to overcome on their own. Wiseman says that, in his scenario, as two classical worlds approach an energetic barrier from either side, one of them will increase in speed while the other will bounce back. The leading world will thus pop through the seemingly insurmountable barrier, just as particles do in quantum tunneling.
But much work remains. “By no means have we answered all the questions that such a shift entails,” says Wiseman. Among other things, he and his collaborators have yet to overcome challenges such as explaining how their many-interacting-worlds theory could explain quantum entanglement, a phenomenon in which particles separated by a distance are still linked in terms of their properties.
Scientists at IBM and Repsol SA, Spain’s largest energy company, announced today (Oct. 30) the world’s first research collaboration using cognitive technologies like IBM’s Watson to jointly develop and apply new tools to make it cheaper and easier to find new oil fields.
An engineer will typically have to manually read through an enormous set of journal papers and baseline reports with models of reservoir, well, facilities, production, export, and seismic imaging data.
IBM says its cognitive technologies could help by analyzing hundreds of thousands of papers, prioritizing data, and linking that data to the specific decision at hand. It will introduce “new real-time factors to be considered, such as current news events around economic instability, political unrest, and natural disasters.”
The oil and gas industry boasts some of the most advanced geological, geophysical and chemical science in the world. But the challenge is to integrate critical geopolitical, economic, and other global news into decisions. And that will require a whole new approach to computing that can speed access to business insights, enhance strategic decision-making, and drive productivity, IBM says.
This goes beyond the capabilities of Watson. But scientists at IBM’s Cognitive Environments Laboratory (CEL), collaborating with Repsol, plan to develop and apply new prototype cognitive tools for real-world use cases in the oil and gas industry. They will experiment with a combination of traditional and new interfaces based upon spoken dialog, gesture, robotics and advanced visualization and navigation techniques.
The objective is to build conceptual and geological models, highlight the impact of potential risks and uncertainty, visualize trade-offs, and explore what-if scenarios to ensure the best decision is made, IBM says.
Repsol is making an initial investment of $15 million to $20 million to develop two applications targeted for next year, Repsol’s director for exploration and production technology Santiago Quesada explained to Bloomberg Businessweek. “One app will be used for oil exploration and the other to help determine the most attractive oil and gas assets to buy.”
An endangered population of giant tortoises has recovered on the Galapagos island of Espanola.
Some 40 years after the first captive-bred tortoises were reintroduced to the island by the Galapagos National Park Service, the endemic Española giant tortoises are reproducing and restoring some of the ecological damage caused by feral goats that were brought to the island in the late 19th century.
"The global population was down to just 15 tortoises by the 1960s. Now there are some 1,000 tortoises breeding on their own. The population is secure. It's a rare example of how biologists and managers can collaborate to recover a species from the brink of extinction, " said James P. Gibbs, a professor of vertebrate conservation biology at the SUNY College of Environmental Science and Forestry (ESF) and lead author of the paper published in the journal PLOS ONE.
Gibbs and his collaborators assessed the tortoise population using 40 years of data from tortoises marked and recaptured repeatedly for measurement and monitoring by members of the Galapagos National Park Service, Charles Darwin Foundation, and visiting scientists.
But there is another side to the success story: while the tortoise population is stable, it is not likely to increase until more of the landscape recovers from the damage inflicted by the now-eradicated goats. Since the goats devoured all the grassy vegetation and were subsequently removed from the island, more shrubs and small trees have grown on Española. This hinders both the growth of cactus, a vital part of the tortoise's diet, and the tortoises' movement. Chemical analysis of the soil, done by Dr. Mark Teece, an ESF chemistry professor, shows there has been a pronounced shift from grasses to woody plants on the island in the last 100 years.
New research by physicists from Brown University puts the profound strangeness of quantum mechanics in a nutshell—or, more accurately, in a helium bubble.
Experiments led by Humphrey Maris, professor of physics at Brown, suggest that the quantum state of an electron—the electron's wave function—can be shattered into pieces and those pieces can be trapped in tiny bubbles of liquid helium. To be clear, the researchers are not saying that the electron can be broken apart. Electrons are elementary particles, indivisible and unbreakable. But what the researchers are saying is in some ways more bizarre.
In quantum mechanics, particles do not have a distinct position in space. Instead, they exist as a wave function, a probability distribution that includes all the possible locations where a particle might be found. Maris and his colleagues are suggesting that parts of that distribution can be separated and cordoned off from each other.
"We are trapping the chance of finding the electron, not pieces of the electron," Maris said. "It's a little like a lottery. When lottery tickets are sold, everyone who buys a ticket gets a piece of paper. So all these people are holding a chance and you can consider that the chances are spread all over the place. But there is only one prize—one electron—and where that prize will go is determined later."
If Maris's interpretation of his experimental findings is correct, it raises profound questions about the measurement process in quantum mechanics. In the traditional formulation of quantum mechanics, when a particle is measured—meaning it is found to be in one particular location—the wave function is said to collapse.
"The experiments we have performed indicate that the mere interaction of an electron with some larger physical system, such as a bath of liquid helium, does not constitute a measurement," Maris said. "The question then is: What does?"
Researchers believe they have confirmed the existence of a new type of chemical bond, first proposed some 30 years ago but never convincingly demonstrated because of the lack of experimental evidence and the relatively poor accuracy of the quantum chemistry methods that prevailed at the time. The new work also shows how substituting isotopes can result in fundamental changes in the nature of chemical bonding.
In the early 1980s it was proposed that in certain transition states consisting of a very light atom sandwiched between two heavy ones, the system would be stabilised not by conventional van der Waals forces, but by vibrational bonding, with the light atom shuttling between its two neighbours. However, despite several groups searching for such a system, none was demonstrated and the hunt fizzled out.
Now, Jörn Manz, of the Free University of Berlin and Shanxi University in China, and colleagues believe they have the theoretical and experimental evidence to demonstrate a stable vibrational bond.
The researchers carried out a series of theoretical experiments looking at the reaction of BrH with Br to create the radical BrHBr, but using different isotopes of hydrogen. By using muons – elementary particles that are similar to an electron but have greater mass – the team substituted a range of hydrogen isotopes into BrHBr, from the relatively hefty muonic helium, 4H, down to the extremely light muonium, Mu, with roughly one-fortieth the mass of 4H.
The team mapped two key parameters: the potential energy surface of the system – the three-dimensional ‘landscape’ of hills and valleys relating the potential energy of the system to its geometry – and a quantum mechanical parameter, the vibrational zero point energy or ZPE.
Classically, a bond will form if there is a net reduction in the potential energy of the system. However, in certain circumstances, if there is a sufficiently large decrease in the vibrational ZPE, this can overcome the need for a decrease in potential energy and the system can be stabilised by a vibrational bond.
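The mass dependence behind this argument can be sketched with a one-dimensional harmonic-oscillator estimate, ZPE = ½ħω with ω = √(k/μ): the lighter the isotope, the larger its ZPE, and hence the larger the possible ZPE decrease available to stabilise a vibrational bond. The force constant and isotope masses below are illustrative placeholders, not values from the study.

```python
# Harmonic-oscillator sketch of how isotope mass sets zero-point energy.
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
AMU = 1.66053906660e-27  # atomic mass unit, kg
EV = 1.602176634e-19     # electronvolt, J

def zpe_ev(force_constant, reduced_mass_amu):
    """ZPE = (1/2) * hbar * sqrt(k/mu) of a 1-D oscillator, in eV."""
    omega = math.sqrt(force_constant / (reduced_mass_amu * AMU))
    return 0.5 * HBAR * omega / EV

k = 200.0  # N/m, an illustrative stretching force constant
for name, mass_amu in [("muonium (Mu)", 0.114),
                       ("ordinary H", 1.008),
                       ("muonic helium (4H)", 4.116)]:
    print(f"{name}: ZPE ~ {zpe_ev(k, mass_amu):.3f} eV")
```

Because ZPE scales as 1/√μ, the nearly 40-fold mass range from muonium to muonic helium changes the ZPE by roughly a factor of six, which is why the choice of isotope can tip the balance between a conventional and a vibrational bond.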
A multidisciplinary engineering team at the University of California, San Diego developed a new nanoparticle-based material for concentrating solar power plants designed to absorb and convert to heat more than 90 percent of the sunlight it captures. The new material can also withstand temperatures greater than 700 degrees Celsius and survive many years outdoors in spite of exposure to air and humidity. Their work, funded by the U.S. Department of Energy's SunShot program, was published recently in two separate articles in the journal Nano Energy.
By contrast, current solar absorber materials function at lower temperatures and need to be overhauled almost every year for high-temperature operations. "We wanted to create a material that absorbs sunlight that doesn't let any of it escape. We want the black hole of sunlight," said Sungho Jin, a professor in the department of Mechanical and Aerospace Engineering at UC San Diego Jacobs School of Engineering. Jin, along with professor Zhaowei Liu of the department of Electrical and Computer Engineering, and Mechanical Engineering professor Renkun Chen, developed the silicon boride-coated nanoshell material. They are all experts in functional materials engineering.
The novel material features a "multiscale" surface created by using particles of many sizes, ranging from 10 nanometers to 10 micrometers. The multiscale structures can trap and absorb light, which contributes to the material's high efficiency when operated at higher temperatures.
Concentrating solar power (CSP) is an emerging alternative clean energy market that produces approximately 3.5 gigawatts worth of power at power plants around the globe—enough to power more than 2 million homes, with additional construction in progress to provide as much as 20 gigawatts of power in coming years. One of the technology's attractions is that it can be used to retrofit existing power plants that use coal or fossil fuels because it uses the same process to generate electricity from steam.
Google is launching a new “Nanoparticle Platform” project to develop medical diagnostic technology using nanoparticles, Andrew Conrad, head of the Google X Life Sciences team, disclosed Tuesday at The Wall Street Journal’s WSJD Live conference. The idea is to use nanoparticles with magnetic cores, circulating in the bloodstream with recognition molecules, to detect cancer, plaques, or too much sodium, for example.
There are a number of similar research projects using magnetic (and other) nanoparticles in progress, as reported on KurzweilAI. What’s new in the Google project is delivering nanoparticles to the bloodstream via a pill and using a wearable wrist detector to detect the nanoparticles’ magnetic field and read out diagnostic results.
But this is an ambitious moonshot project. “Google is at least five to seven years away from a product approved for use by doctors,” said Sam Gambhir, chairman of radiology at Stanford University Medical School, who has been advising Dr. Conrad on the project for more than a year, the WSJ reports.
“Even if Google can make the system work, it wouldn’t immediately be clear how to interpret the results. That is why Dr. Conrad’s team started the Baseline study [see “New Google X Project to look for disease and health patterns in collected data”], which he hopes will create a benchmark for comparisons.”
Research by Hugh Byrd, Professor of Architecture at the University of Lincoln, UK, and Steve Matthewman, Associate Professor of Sociology at the University of Auckland, New Zealand, highlights the vulnerabilities of power systems and weakening electrical infrastructure across the globe, particularly in built-up urban areas.
With a single prick and a single drop of blood, a San Diego company claims they can now detect if a patient has Ebola in less than 10 minutes. The breakthrough technology is called “Ebola Plus,” a tool that can be used to detect Ebola on anyone, anywhere in the world.
“We can do that for a large number of tests simultaneously with just one drop of blood,” said Dr. Cary Gunn, Ph.D., CEO of Genalyte. Once blood is drawn, a silicon chip is used to detect the virus as blood flows over it.
Researchers at Genalyte have been working on the diagnostic tool for seven years, using it to test for various diseases, and only recently discovered it could also work to spot Ebola. “It allows you to screen more patients more rapidly. The biggest question right now is the debate about quarantine.
Instead of asking people to take their fever once or twice a day, they can just take a prick of blood,” said Dr. Gunn. It can analyze up to 100 samples per hour, and be administered anywhere, including hospitals, airports, and even remote areas in West Africa where the disease is spreading rapidly. “Right now, most people in Liberia aren’t even being tested. People who have suspicion of having Ebola are being checked into wards. The ability to take a prick of blood and do the test would be a game changer in that environment,” said Gunn.
Developing the platform for the test cost Genalyte around $100,000, but each chip used during the tests costs $10 – making early detection cheaper and easier for caretakers. Currently, the FDA has only approved PCR tests, which can take two hours to return results, compared to Ebola Plus, which can provide results in ten minutes.
On islands off the coast of Florida, scientists uncover swift adaptive changes among Carolina anole populations, whose habitats were disturbed by the introduction of another lizard species.
For most of its existence, the Carolina anole (Anolis carolinensis) was the only lizard in the southeastern U.S. It could perch where it wanted, eat what it liked. But in the 1970s, aided by the human pet trade, the brown anole (Anolis sagrei)—native to Cuba and the Bahamas—came marching in. In experiments on islands off the coast of Florida, scientists studying the effects of the species mixing witnessed evolution in action: the Carolina anole started perching higher up in trees, and its toe pads changed to enable better grip—all in a matter of 15 years, or about 20 lizard generations.
In a paper published in Science today (October 23), Yoel Stuart of the University of Texas at Austin, Todd Campbell from the University of Tampa, Florida, and their colleagues discuss what happened when the two species converged upon the same habitats.
In a petri dish in the bowels of Harvard Medical School scientists have tweaked three genes from the cells of an Asian elephant that help control the production of hemoglobin, the protein in blood that carries oxygen. Their goal is to make these genes more like those of an animal that last walked the planet thousands of years ago: the woolly mammoth.
Six Case Western Reserve scientists are part of an international team that has discovered two compounds that show promise in decreasing inflammation associated with diseases such as ulcerative colitis, arthritis and multiple sclerosis. The compounds, dubbed OD36 and OD38, specifically appear to curtail inflammation-triggering signals from RIPK2 (receptor-interacting protein kinase 2). RIPK2 is an enzyme that activates high-energy molecules to prompt the immune system to respond with inflammation. The findings of this research appear in the Journal of Biological Chemistry.
Ferroelectric materials – commonly used in transit cards, gas grill igniters, video game memory and more – could become strong candidates for use in next-generation computers, thanks to new research led by scientists at the University of California, Berkeley, and the University of Pennsylvania.
The researchers found an easy way to improve the performance of ferroelectric materials in a way that makes them viable candidates for low-power computing and electronics. They described their work in a study published today (Sunday, Oct. 26) in the journal Nature Materials.
Ferroelectric materials have spontaneous polarization as a result of small shifts of negative and positive charges within the material. A key characteristic of these materials is that the polarization can be reversed in response to an electric field, enabling the creation of a “0” or “1” data bit for memory applications. Ferroelectrics can also produce an electric charge in response to physical force, such as being pressed, squeezed or stretched, which is why they are found in applications such as push-button igniters on portable gas grills.
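That switching behaviour can be pictured with an idealized toy model (a sketch for intuition, not the study's physics; the threshold value is arbitrary): a ferroelectric bit is a bistable polarization that flips only when the applied field exceeds a coercive threshold, and keeps its state once the field, or the power, is removed.

```python
# Idealized ferroelectric bit: two stable polarization states (+1 / -1)
# written by fields beyond a coercive threshold, retained at zero field.

COERCIVE_FIELD = 1.0  # arbitrary units

def apply_field(polarization, field):
    """Return the polarization state (+1 or -1) after a field pulse."""
    if field >= COERCIVE_FIELD:
        return +1            # strong positive field writes a "1"
    if field <= -COERCIVE_FIELD:
        return -1            # strong negative field writes a "0"
    return polarization      # sub-coercive fields leave the bit unchanged

state = -1                        # start in the "0" state
state = apply_field(state, 1.5)   # positive pulse above threshold writes "1"
state = apply_field(state, 0.0)   # power off: the state is retained
print(state)  # → 1
```

The last step is the non-volatility the article describes: unlike a transistor, the bit needs no electricity to remember which state it is in.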
“What we discovered was a fundamentally new and unexpected way for these ferroelectric materials to respond to applied electric fields,” said study principal investigator Lane Martin, UC Berkeley associate professor of materials science and engineering. “Our discovery opens up the possibility for faster switching and new control over novel, never-before-expected multi-state devices.”
Martin and other UC Berkeley researchers partnered with a team led by Andrew Rappe, University of Pennsylvania professor of chemistry and of materials science and engineering. UC Berkeley graduate student Ruijuan Xu led the study’s experimental design, and Penn graduate student Shi Liu led the study’s theoretical modeling.
Scientists have turned to ferroelectrics as an alternative form of data storage and memory because the material holds a number of advantages over conventional semiconductors. For example, anyone who has ever lost unsaved computer data after power is unexpectedly interrupted knows that today’s transistors need electricity to maintain their “on” or “off” state in an electronic circuit.
Because ferroelectrics are non-volatile, they can remain in one polarized state or another without power. This ability of ferroelectric materials to store memory without continuous power makes them useful for transit cards, such as the Clipper cards used to pay fare in the Bay Area, and in certain memory cards for consumer electronics. If used in next-generation computers, ferroelectrics would enable the retention of information so that data would be there if electricity goes out and then is restored.
“If we could integrate these materials into the next generation of computers, people wouldn’t lose their data if the power goes off,” said Martin, who is also a faculty scientist at the Lawrence Berkeley National Laboratory. “For an individual, losing unsaved work is an inconvenience, but for large companies like eBay, Google and Amazon, losing data is a significant loss of revenue.”
So what has held ferroelectrics back from wider use as on/off switches in integrated circuits? The answer is speed, according to the study authors.