Amazing Science
Scooped by Dr. Stefan Gruenwald!

Robotic underwater gliders reveal why the Antarctic ice sheet is melting 150 billion tons per year

At current rates, ice sheet loss will become the most significant contributor to global sea level rise during this century, yet there is still a lot that scientists don't know about the underlying causes. This is partly because Antarctica is such a difficult place to take measurements.

But now robotic underwater gliders are giving scientists new insight into why the Antarctic ice sheet is melting. An ice sheet is a huge layer of ice that sits on land. The two on the Earth today are found on Antarctica and Greenland, but in the last ice age there were also ice sheets on North America and northern Europe.

The Antarctic ice sheet spans more than 14 million square kilometers, which is roughly the same size as the US and Mexico put together. The ice sheet also spills out onto the surrounding ocean in the form of ice shelves.

The Intergovernmental Panel on Climate Change (IPCC) estimates that the Antarctic ice sheet is currently losing around 150 billion tonnes of ice per year. One of the main areas of ice loss is the Antarctic Peninsula, shown in the red rectangle in the map below.
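
As a rough plausibility check, the annual mass loss can be converted into a global sea-level-rise equivalent. This is a back-of-the-envelope sketch, assuming that one tonne of meltwater occupies about one cubic metre and that the global ocean surface area is roughly 3.618 × 10^8 km²; neither figure is from the article:

```python
# Rough sea-level-rise equivalent of Antarctic ice loss.
ICE_LOSS_TONNES_PER_YEAR = 150e9   # ~150 billion tonnes/yr (IPCC estimate cited above)
OCEAN_AREA_M2 = 3.618e8 * 1e6      # assumed global ocean area, km^2 -> m^2

melt_volume_m3 = ICE_LOSS_TONNES_PER_YEAR * 1.0  # 1 tonne of water ~ 1 m^3
rise_m_per_year = melt_volume_m3 / OCEAN_AREA_M2
print(f"{rise_m_per_year * 1000:.2f} mm of sea level rise per year")
```

The result, about 0.4 mm per year, shows why Antarctic loss is only one of several contributors to the total rise observed today.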

Toxic algae blooms cause illness and death in dogs

If after playing in still water, a dog starts vomiting, has tremors and becomes lethargic, the pet may have been poisoned by toxic algae. In severe cases, dogs can show symptoms within minutes and die within an hour of exposure. As a result, Cornell experts recommend keeping dogs on leashes around potentially algae-ridden water and preventing them from ingesting toxic scum off the water, the beach or themselves.

Over the last several years, reports of pets falling ill or even dying after swimming have skyrocketed, said David MacNeill, a Great Lakes fisheries and ecosystem specialist at New York Sea Grant, which is part of Cornell Cooperative Extension.

The reasons behind this uptick in reports remain unclear. Theories include increased harmful algae blooms from fertilizer and other nutrient runoff that promotes algae growth; in some systems, zebra mussels regenerate phosphate nutrients close to shore, which also spurs algae growth. Warm temperatures also promote growth. And more awareness among dog owners and veterinarians of the signs of algae toxicity may have led to more reported cases.

"A lot of cases go unreported because [algae bloom poisoning] can be difficult to diagnose," said MacNeill. "Fortunately, monitoring of our waters for algal toxins is improving, and now that diagnostic capabilities have improved, we are seeing more confirmed cases."

While harmful algae blooms can sicken people, the only recorded human deaths occurred in Brazil when contaminated water was used for dialysis. But dogs are especially vulnerable due to their behaviors of lapping up pond or lake water, licking wet fur and eating algae debris on beaches, as they are attracted to the smell.

To avoid dangerous climate change, fossil fuels should be phased out by 2100, IPCC study finds

The unrestricted use of fossil fuels should be phased out by 2100 if the world is to avoid dangerous climate change, a UN-backed expert panel says. The Intergovernmental Panel on Climate Change says in a stark report that most of the world's electricity can - and must - be produced from low-carbon sources by 2050. If not, the world faces "severe, pervasive and irreversible" damage.

The UN said inaction would cost "much more" than taking the necessary action. The IPCC's Synthesis Report was published on Sunday in Copenhagen, after a week of intense debate between scientists and government officials.

It is intended to inform politicians engaged in attempts to deliver a new global treaty on climate by the end of 2015. The report says that reducing emissions is crucial if global warming is to be limited to 2°C - a target acknowledged in 2009 as the threshold of dangerous climate change.

The report suggests renewables will have to grow from their current 30% share to 80% of the power sector by 2050. In the longer term, the report states that fossil fuel power generation without carbon capture and storage (CCS) technology would need to be "phased out almost entirely by 2100". 

The Synthesis Report summarises three previous reports from the IPCC, which outlined the causes, the impacts and the potential solutions to climate change.

It re-states many familiar positions:

  • Warming is "unequivocal" and the human influence on climate is clear
  • The period from 1983 to 2012 was likely the warmest 30-year period of the last 1,400 years
  • Warming impacts are already being seen around the globe, in the acidification of the oceans, the melting of Arctic ice and poorer crop yields in many parts
  • Without concerted action on carbon, temperatures will increase over the coming decades and could be almost 5°C above pre-industrial levels by the end of this century
IBM Watson uses cognitive technologies to help find new sources of oil

Scientists at IBM and Repsol SA, Spain's largest energy company, announced today (Oct. 30) the world’s first research collaboration using cognitive technologies like IBM’s Watson to jointly develop and apply new tools to make it cheaper and easier to find new oil fields.

An engineer will typically have to manually read through an enormous set of journal papers and baseline reports with models of reservoir, well, facilities, production, export, and seismic imaging data.

IBM says its cognitive technologies could help by analyzing hundreds of thousands of papers, prioritizing data, and linking that data to the specific decision at hand. It will introduce “new real-time factors to be considered, such as current news events around economic instability, political unrest, and natural disasters.”

The oil and gas industry boasts some of the most advanced geological, geophysical and chemical science in the world. But the challenge is to integrate critical geopolitical, economic, and other global news into decisions. And that will require a whole new approach to computing that can speed access to business insights, enhance strategic decision-making, and drive productivity, IBM says.

This goes beyond the capabilities of Watson. But scientists at IBM’s Cognitive Environments Laboratory (CEL), collaborating with Repsol, plan to develop and apply new prototype cognitive tools for real-world use cases in the oil and gas industry. They will experiment with a combination of traditional and new interfaces based upon spoken dialog, gesture, robotics and advanced visualization and navigation techniques.

The objective is to build conceptual and geological models, highlight the impact of potential risks and uncertainty, visualize trade-offs, and explore what-if scenarios to ensure the best decision is made, IBM says.

Repsol is making an initial investment of $15 million to $20 million to develop two applications targeted for next year, Repsol’s director for exploration and production technology Santiago Quesada explained to Bloomberg Businessweek. “One app will be used for oil exploration and the other to help determine the most attractive oil and gas assets to buy.”

Drying Amazon Could Be Major Carbon Concern Going Forward

The lungs of the planet are drying out, threatening to cause Earth to cough up some of its carbon reserves. The Amazon rainforest inhales massive amounts of carbon dioxide from the atmosphere, helping keep the globe’s carbon budget in balance (at least until human emissions started throwing that balance off). But as a new study shows, since 2000 drier conditions are causing a decrease in lung capacity. And if the Amazon’s breaths become more shallow, it’s possible a feedback loop could set in, further reducing lung capacity and throwing the carbon balance further out of whack.

The study, published in the Proceedings of the National Academy of Sciences on Monday, shows that a decline in precipitation has contributed to less healthy vegetation since 2000. “It’s well-established fact that a large part of Amazon is drying. We’ve been able to link that decline in precipitation to a decline in greenness over the last 10 years,” said Thomas Hilker, lead author of the study and forestry expert at Oregon State University.

Since 2000, rainfall has decreased by up to 25 percent across a vast swath of the southeastern Amazon, according to the new satellite analysis by Hilker. The cause of the decline in rainfall hasn’t been pinpointed, though deforestation and changes in atmospheric circulation are possible culprits.

The decrease mostly affected an area of tropical forest 12 times the size of California, as well as adjacent grasslands and other forest types. The browning of that area, which is in the southern Amazon, accounted for more than half the loss of greenness observed by satellites. While the decrease in greenness is small compared with the overall lushness of the rainforest, the impacts could be outsize.

That’s because the amount of carbon the Amazon stores is staggering. An estimated 120 billion tons of carbon are stashed in its plants and soil. Much of that carbon gets there via the forest flora that suck carbon dioxide out of the atmosphere. Worldwide, “it essentially takes up 25 percent of global carbon cycle that vegetation is responsible for,” Hilker said. “It’s a huge carbon stock.”

New 'smart' material improves removal of arsenic from drinking water

Scientists have created a new material that can remove twice as much arsenic from water as the leading material for water treatment. Arsenic is a toxic element found naturally in groundwater. Long-term exposure over a number of years to elevated concentrations of arsenate, the chemical form of arsenic in water, is associated with debilitating, and potentially fatal, illnesses including cancer, heart and lung disease, gastrointestinal problems and neurological disorders.

Arsenic-contaminated drinking water has been identified in many countries across the globe, including Bangladesh, Chile, Mexico, Argentina, Australia, USA and parts of the UK. Recent estimates suggest that more than 200 million people are unknowingly exposed to unsafe levels of arsenic in their drinking water.  

In a new study published in Chemistry - A European Journal, scientists at Imperial College London have designed, tested and patented a new zinc-based material that can selectively bind to arsenate with strong affinity. The scientists hope this material could ultimately be used to improve the quality of domestic water filters and reduce the amount of arsenic that people are exposed to in areas with known or suspected high arsenic content.

In 2006, the World Health Organization issued guidelines defining the safe concentration of arsenic as 10 parts per billion, but several countries affected by arsenic-contaminated groundwater have legal concentration limits above this guideline, and recent evidence suggests that long-term exposure to even smaller concentrations can be harmful.

Algae deliver hydrogen at five times higher efficiency

Hydrogen as a regenerative fuel produced in gigantic water tanks full of algae, which need nothing more than sunlight to produce the promising green energy carrier: a great idea in theory, but one that fails due to the vast amount of space required for the production process. Scientists from the Max Planck Institutes for Chemical Energy Conversion and for Coal Research in Mülheim an der Ruhr, and from the Photobiotechnology research group at Ruhr-Universität Bochum (RUB), have now discovered a way of increasing the efficiency of hydrogen production in microalgae by a factor of five. If the algae can generate the fuel more efficiently, it can be produced in a smaller area and in quantities suitable for practical use. This approach also dispenses with the need for the rare and expensive precious metals that are otherwise used to technically split the energy-rich gas from water.

Living organisms need electrons in many places, as they use them to form chemical compounds. Algae and other organisms which carry out photosynthesis release electrons from water with the help of sunlight and then distribute them in the cell. The iron-containing protein PETF is responsible for this: it transports the electrons in particular to ferredoxin-NADP+ oxidoreductase (FNR), so that NADPH is formed and carbohydrates are finally synthesised from carbon dioxide. The production of hydrogen through hydrogenases is among the many other processes for which PETF provides the necessary electrons.

Hydrogenases are very efficient enzymes that contain a unique active centre comprising six iron atoms, where the electrons supplied by PETF are bound to protons. Molecular hydrogen is produced in this way.

With the help of nuclear magnetic resonance spectroscopy, on which magnetic resonance imaging in medicine is also based, the scientists working with Sigrun Rumpel, a postdoc at the Max Planck Institute for Chemical Energy Conversion in Mülheim, investigated the components of PETF – or more precisely the amino acids – that interact with the hydrogenase and those that interact with FNR. It emerged that only two amino acids of PETF are important for binding FNR. When the researchers modified these two amino acids and the enzyme FNR, PETF was no longer able to bind FNR as efficiently. Thus, the enzyme transferred fewer electrons to FNR and more to the hydrogenase. In this way, the scientists increased hydrogen production by a factor of five.

“For a technically feasible hydrogen production with the help of algae, its efficiency must be increased by a factor of 10 to 100 compared to the natural process,” says Sigrun Rumpel. “Through the targeted modification of PETF and FNR we have taken a step towards achieving this objective.” Up to now, the production of hydrogen from renewable energy sources has involved the electrolytic splitting of water, which currently requires expensive and rare precious metals like platinum. Sigrun Rumpel and other researchers are therefore working on finding a way of enabling algae to produce the fuel efficiently. Microalgae produce the gas naturally, but in very small volumes. Thus, if cars were one day to be powered by hydrogen rather than petrol or diesel, gigantic areas with tanks full of algal cultures would have to be set up to come anywhere near covering Germany’s fuel requirements.

“These results represent a path to the economically-viable regenerative production of fuels with the help of microalgae,” says Sigrun Rumpel. The change of electron transfer pathways could further improve hydrogen production in future. The researchers therefore now want to combine different modifications with each other.

Oceans experiencing largest sea level rise in 6,000 years, study says

There are two main forces that can drive sea levels higher. One is something called thermal expansion, which involves the expansion of ocean water as it warms. The other is an influx of additional water, ushered into the sea by melting ice sheets and glaciers. Scientists have long concluded that sea levels are rising. Just look at Miami. Or the Maldives. They’ve also discerned that major ice sheets are melting at a faster clip than previously understood.

What has been less clear, however, is whether the development is recent or not. Over the last several thousand years, has the ocean risen and fallen and risen again? A new study, just published in PNAS, suggests that the ocean has been surprisingly static since 4,000 B.C.

But that changed 150 years ago. Reconstructing 35,000 years of sea fluctuations, the study, which researchers say is the most comprehensive of its kind, found that the oceans are experiencing greater sea rise than at any time over the last 6,000 years. “What we see in the tide gauges, we don’t see in the past record, so there’s something going on today that wasn’t going on before,” lead author Kurt Lambeck, a professor at Australian National University, told the Australian Broadcasting Corporation. “I think that is clearly the impact of rising temperatures.”

How much has the sea risen over the past century and a half? A lot. And it’s surging faster than ever. “There is robust evidence that sea levels have risen as a result of climate change,” Australian government research has found. “Over the last century, global average sea level rose by 1.7 mm [0.067 inches] per year; in recent years (between 1993 and 2010), this rate increased to 3.2 mm [0.126 inches] per year.” In all, the sea has risen roughly 20 centimeters since the start of the 20th century. “The rate of sea level rise over the last century is unusually high in the context of the last 2,000 years,” the Australian report added.
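
The quoted rates can be checked against the roughly 20 cm total. A minimal sketch, assuming purely for illustration that the 1.7 mm/yr rate applies from 1900 to 1993 and the 3.2 mm/yr rate from 1993 to 2010 (the article does not state this split):

```python
# Cumulative sea level rise implied by the two quoted rates.
# Assumed split (not stated in the article): 1.7 mm/yr for 1900-1993,
# 3.2 mm/yr for 1993-2010.
rise_mm = 1.7 * (1993 - 1900) + 3.2 * (2010 - 1993)
print(f"{rise_mm / 10:.1f} cm")  # ~21 cm, close to the "roughly 20 centimeters" quoted above
```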

But it’s not just the last 2,000 years. It’s the last 6,000 years, according to this recent study. And now, the rise in sea levels over the last 100 years is “beyond dispute,” Lambeck explained.

NERONYC's curator insight, October 19, 2014 6:04 PM

Preserve our environment

Icebergs once drifted to Florida, new climate model suggests

This is the first study to show that when the large ice sheet over North America known as the Laurentide ice sheet began to melt, icebergs calved into the sea around Hudson Bay and would have periodically drifted along the east coast as far south as Miami.

IMAGE: This is a map showing the pathway taken by icebergs from Hudson Bay, Canada, to Florida. The blue colors (behind the arrows) are actual snapshots.

Using a first-of-its-kind, high-resolution numerical model to describe ocean circulation during the last ice age about 21,000 years ago, oceanographer Alan Condron of the University of Massachusetts Amherst has shown that icebergs and meltwater from the North American ice sheet would have regularly reached South Carolina and even southern Florida. The models are supported by the discovery of iceberg scour marks on the sea floor along the entire continental shelf.

Such a view of past meltwater and iceberg movement implies that the mechanisms of abrupt climate change are more complex than previously thought, Condron says. "Our study is the first to show that when the large ice sheet over North America known as the Laurentide ice sheet began to melt, icebergs calved into the sea around Hudson Bay and would have periodically drifted along the east coast of the United States as far south as Miami and the Bahamas in the Caribbean, a distance of more than 3,100 miles, about 5,000 kilometers."

His work, conducted with Jenna Hill of Coastal Carolina University, is described in the current advance online issue of Nature Geoscience. "Determining how far south of the subpolar gyre icebergs and meltwater penetrated is vital for understanding the sensitivity of North Atlantic Deep Water formation and climate to past changes in high-latitude freshwater runoff," the authors say.

Hill analyzed high-resolution images of the sea floor from Cape Hatteras to Florida and identified about 400 scour marks on the seabed that were formed by enormous icebergs plowing through mud on the sea floor. These characteristic grooves and pits were formed as icebergs moved into shallower water and their keels bumped and scraped along the ocean floor.

"The depth of the scours tells us that icebergs drifting to southern Florida were at least 1,000 feet, or 300 meters thick," says Condron. "This is enormous. Such icebergs are only found off the coast of Greenland today."

To investigate how icebergs might have drifted as far south as Florida, Condron simulated the release of a series of glacial meltwater floods in his high-resolution ocean circulation model at four different levels for two locations, Hudson Bay and the Gulf of St. Lawrence.

Condron reports, "In order for icebergs to drift to Florida, our glacial ocean circulation model tells us that enormous volumes of meltwater, similar to a catastrophic glacial lake outburst flood, must have been discharging into the ocean from the Laurentide ice sheet, from either Hudson Bay or the Gulf of St. Lawrence."

Further, during these large meltwater flood events, the surface ocean current off the coast of Florida would have undergone a complete, 180-degree flip in direction, so that the warm, northward flowing Gulf Stream would have been replaced by a cold, southward flowing current, he adds.

Earth lost 50% of its wildlife in the past 40 years

The number of wild animals on Earth has halved in the past 40 years, according to a new analysis. Creatures across land, rivers and the seas are being decimated as humans kill them for food in unsustainable numbers, while polluting or destroying their habitats, the research by scientists at WWF and the Zoological Society of London found.

“If half the animals died in London zoo next week it would be front page news,” said Professor Ken Norris, ZSL’s director of science. “But that is happening in the great outdoors. This damage is not inevitable but a consequence of the way we choose to live.” He said nature, which provides food and clean water and air, was essential for human wellbeing.

“We have lost one half of the animal population and knowing this is driven by human consumption, this is clearly a call to arms and we must act now,” said Mike Barratt, director of science and policy at WWF. He said more of the Earth must be protected from development and deforestation, while food and energy had to be produced sustainably.

The steep decline of animal, fish and bird numbers was calculated by analyzing 10,000 different populations, covering 3,000 species in total. This data was then, for the first time, used to create a representative “Living Planet Index” (LPI), reflecting the state of all 45,000 known vertebrates.
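
Broadly speaking, an index of this kind aggregates population trends with a geometric mean, so that halving and doubling cancel out. The sketch below is a simplified illustration of that aggregation step, not the actual LPI methodology, and the three populations are hypothetical:

```python
import math

def living_planet_index(baseline, current):
    """Geometric mean of current/baseline population ratios."""
    ratios = [c / b for b, c in zip(baseline, current)]
    log_mean = sum(math.log(r) for r in ratios) / len(ratios)
    return math.exp(log_mean)

# Hypothetical example: two populations halve, one stays stable.
index = living_planet_index([1000, 500, 80], [500, 250, 80])
print(round(index, 2))  # 0.63, i.e. a 37% decline in the index
```

The real LPI additionally models each population's trend over time and weights taxa and regions, but the geometric-mean core is what lets a 10,000-population dataset be summarized as a single "halved in 40 years" figure.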

“We have all heard of the FTSE 100 index, but we have missed the ultimate indicator, the falling trend of species and ecosystems in the world,” said Professor Jonathan Baillie, ZSL’s director of conservation. “If we get [our response] right, we will have a safe and sustainable way of life for the future,” he said.

If not, he added, the overuse of resources would ultimately lead to conflicts. He said the LPI was an extremely robust indicator and had been adopted by the UN's internationally agreed Convention on Biological Diversity as a key insight into biodiversity.

Boundary Dam Power Plant: Let the Clean Coal Era Begin

On October 2, the Boundary Dam power plant in Saskatchewan became the first full-sized coal-fired boiler to capture the copious carbon dioxide that had previously billowed from its smokestack, preventing the greenhouse gas from entering the atmosphere. On the resulting invisible stream of hot smoke ride the hopes of combating climate change while still burning fossil fuels.

Such CO2 capture and storage (CCS) “is the only known technology that will enable us to continue to use fossil fuels and also de-carbonize the energy sector,” said Maria van der Hoeven, executive director of the International Energy Agency, in a statement on the opening of the Boundary Dam plant. “As fossil fuel consumption is expected to continue for decades, deployment of CCS is essential.”

That deployment is beginning to happen in fits and starts, and with lots of government support. For example, the Mississippi-based Kemper Facility, a power plant that will turn brown coal to gas and strip off the CO2 in the process, is due online in 2015—a year behind schedule and at a cost of $5.6 billion, more than twice its initial estimate. And the U.S. Environmental Protection Agency has approved plans by Archer Daniels Midland (ADM) to inject CO2 captured at its ethanol fermentation facility in Illinois into a saltwater aquifer deep underground.

The Boundary Dam also burns brown coal, the most polluting form of the most polluting fossil fuel. Saskatchewan has an estimated 300-year supply of the dirty fuel at present rates of consumption. The unit uses amines, nitrogen-based molecules that can bond with CO2, to capture a projected 1 million metric tons of the leading greenhouse gas each year. The amine captures the CO2 and releases it again when further heated, meaning it takes away some of the plant’s power to take away the plant’s CO2. The captured CO2, compressed and liquefied, will then travel 66 kilometers via pipeline to the nearby Weyburn oil fields, joining CO2 captured at a North Dakota plant that turns brown coal into gas. At Weyburn, the CO2 will be used to scour more oil out of the ground. Of course, the eventual burning of this oil will also release CO2 into the atmosphere.

Southern Hemisphere Analysis Reveals That Global Warming Is Underestimated By 24% To 58%

Oceanographers from Lawrence Livermore National Laboratory have discovered that the heat content change of the Earth has been severely underestimated. New calculations suggest that the amount of heat added to the Earth in the last 35 years is 24% to 58% higher than previously thought, due to poor sampling of ocean temperatures in the Southern Hemisphere. The results have global implications, as the ocean absorbs over 90% of the heat trapped by greenhouse gases. The implications are that there has been a greater net inflow of energy from the Sun, and greater amounts are being stored in the world’s oceans.

The scientists obtained satellite measurements of sea surface height and combined them with ocean temperature data collected between 1970 and 2004, a 35-year period. The reason for using sea surface height is that the volume change of the ocean is intimately linked to temperature: water expands as it warms, and additional water is added by increased melt from land ice.
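
The warming-to-height link can be illustrated with the linear thermal expansion relation Δh ≈ α · d · ΔT. In this sketch both numbers are illustrative assumptions, not values from the study: a seawater expansion coefficient of about 2 × 10⁻⁴ per kelvin and a hypothetical 0.1 K warming of the upper layer:

```python
# Illustrative thermal-expansion contribution to sea surface height.
ALPHA_PER_K = 2e-4     # assumed thermal expansion coefficient of seawater
LAYER_DEPTH_M = 700.0  # the shallow-layer depth mentioned in the study
delta_T_K = 0.1        # hypothetical average warming of that layer

delta_h_m = ALPHA_PER_K * LAYER_DEPTH_M * delta_T_K
print(f"~{delta_h_m * 1000:.0f} mm of sea surface rise")
```

Even a tenth-of-a-degree warming of the top 700 meters raises the surface by roughly 14 mm, which is why satellite altimetry is such a sensitive proxy for stored heat.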

Climate models show that the changes in sea surface height of the two hemispheres were consistent with the satellite measurements. However, simulated temperature changes in the shallow layers (down to 700 meters) are not consistent with data collected before 2004. This is believed to be due entirely to the sparsity of measurements in the Southern Hemisphere before new technology was installed to increase the accuracy of the data. Later data show consistency between model and satellite observations.

Soil ecologists confirm: Manhattan's Central Park is home to 170,000 different kinds of microbes

The urban oasis boasts about 170,000 different types of microbes, recent dirt samples show. That diversity is comparable to a tropical rain forest. About 2,000 species are found only in the park.

Manhattan's Central Park is surrounded by one of the densest cities on the planet. It's green enough, yet hardly the first place most people would think of as biologically rich.  But a team of scientists got a big surprise when they recently started digging there. They were 10 soil ecologists — Kelly Ramirez from Colorado State University was among them. "We met on the steps of the natural history museum at 7 a.m. with our collection gear, coolers and sunblock," she recalls.

Their goal: to collect about 600 soil samples from across the park and look for microbes. Why? Because Ramirez was the head of something called The Global Soil Biodiversity Initiative.

Given her love of dirt, Ramirez was the right person for the job. "I think soil biodiversity is like the stars beneath our feet," she says. "There is so much going on in the soil — it's just a hot spot, teeming with so many different types of organisms."

Microbes are architects of soil. They alter its chemistry, even its shape. And in terms of its microbes, Central Park was terra incognita. So the team fanned out and dug. Onlookers were — well, blasé. This was New York City. "I think because they're used to weird things going on in the park," says Ramirez, "it just probably looked sort of normal that we were collecting."

And what the team found turned out to be very surprising — almost 170,000 different kinds of microbes. They didn't expect an urban park to measure up to the wild places they'd sampled around the world. "There's as much biodiversity in the soils of Central Park as we found in the soil ... from the Arctic to Antarctica," says Ramirez, who's now at the Netherlands Institute of Ecology. She's including places like temperate forests, tropical forests and deserts. The species numbers are an average of all those places — some are a bit more or less diverse than Central Park. The team also found 2,000 species of microbes that are apparently unique to Central Park.

Strategies for containing Ebola in West Africa

A multinational Ebola outbreak of unprecedented magnitude was declared a Public Health Emergency of International Concern by the World Health Organization (WHO) on August 8, 2014 (1). From Guinea, the outbreak has spread to the neighboring nations of Liberia and Sierra Leone, subsequently expanding into Nigeria and Senegal (2). Imported Ebola cases have recently led to transmission in the US and Spain (2). As of 15 October, over 9,000 cases and 4,000 fatalities have been reported, with the majority of both occurring in Liberia (2).

The ongoing Ebola outbreak poses an alarming risk to the countries of West Africa and beyond. To assess the effectiveness of containment strategies, a team of scientists has now developed a stochastic model of Ebola transmission between and within the general community, hospitals, and funerals, calibrated to incidence data from Liberia. They find that a combined approach of case isolation, contact tracing with quarantine and sanitary funeral practices must be implemented with utmost urgency in order to reverse the growth of the outbreak. Under status quo intervention, their projections indicate that the Ebola outbreak will continue to spread, generating a predicted 224 (95% CI: 134 – 358) cases daily in Liberia alone by December, highlighting the need for swift application of multifaceted control interventions.
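
The authors' calibrated stochastic model is far richer than can be shown here, but the flavor of such a projection can be conveyed with a much simpler generation-based calculation. Every number below is an illustrative assumption, not a value from the paper: a reproduction number R of 1.8, a 15-day serial interval, and 50 current cases per day:

```python
def project_daily_cases(current_daily, days_ahead, R=1.8, serial_interval=15.0):
    """Project daily incidence assuming cases multiply by R each disease generation."""
    generations = days_ahead / serial_interval
    return current_daily * R ** generations

# e.g. 50 cases/day today, projected 60 days (four generations) ahead:
print(round(project_daily_cases(50, 60)))  # 50 * 1.8**4, about 525
```

Models like the one described above add stochastic variation, separate community, hospital and funeral transmission routes, and the effect of interventions in reducing R below 1, which is what reverses the growth.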

CheunSeungHwan's curator insight, November 6, 2014 8:45 PM

Ebola is not only Africa's problem. The West should give more support to the medical teams, and so should pharmaceutical companies.

Scooped by Dr. Stefan Gruenwald!

Air pollution halves India's grain yield

Air pollution halves India's grain yield | Amazing Science |

Air pollution seems to have a direct and negative impact on grain production in India, a US study warned on Monday, with recent increases in smog decreasing projected yields by half.

Analyzing 30 years of data, scientists developed a statistical model suggesting that air pollution caused wheat yields in densely populated states to be 50% lower than they could have been in 2010. Up to 90% of the decrease in potential food production seems to be linked to smog, a mix of black carbon and other pollutants, the study said. Changes linked to global warming and precipitation levels accounted for the other 10%.

"The numbers are staggering," Jennifer Burney, an author of the study and scientist at the University of California told the Thomson Reuters Foundation.

"We hope our study puts the potential benefits on cleaning up the air on the table," she said, noting that agriculture is often not considered when governments debate the economic costs of air pollution and new legislation aimed at combating it.

The research paper 'Recent climate and air pollution impacts on Indian agriculture', published in the Proceedings of the National Academy of Sciences, analyzed what wheat production could have been with less pollution.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Global groundwater crisis may get worse as the world warms

Global groundwater crisis may get worse as the world warms | Amazing Science |

The world is facing an increasingly dire groundwater depletion crisis, according to a NASA researcher. From India to Texas, people are rapidly depleting their valuable stores of groundwater — leading to the possibility that aquifers may be emptied within decades, a NASA researcher has warned.

In a recent commentary in the journal Nature Climate Change, Jay Famiglietti, who has helped lead the use of a NASA satellite system to detect groundwater changes around the world, warned of dramatic consequences to come if changes are not made to the way that societies manage water supplies. “Our overuse of groundwater puts our overall water security at far greater risk than we thought,” Famiglietti says.

Unlike surface water, which is replenished through precipitation, groundwater can take centuries to recharge. Yet humans are depleting groundwater at rates that far exceed the pace at which this water can be replenished.

Think of it this way: groundwater is analogous to a pension, a long-term investment that takes many years to pay off. If you withdraw more than you put in, you'll go bankrupt in the long run. Dams and reservoirs, meanwhile, are more like a checking account.

"Groundwater is being pumped at far greater rates than it can be naturally replenished, so that many of the largest aquifers on most continents are being mined, their precious contents never to be returned," Famiglietti, a researcher at NASA's Jet Propulsion Laboratory in California, wrote.

Famiglietti has used NASA’s Gravity Recovery and Climate Experiment (GRACE) satellite system, which is capable of detecting the most subtle changes in Earth's gravitational field to spot land elevation changes, and thus water depletion, to publish a number of studies on groundwater in recent years. During the summer, for example, he contributed to a study that revealed that water users throughout the Colorado River Basin are tapping into groundwater supplies to make up for the lack of adequate supplies of surface water.

The study found that more than 75% of the water loss in the Colorado River Basin since 2004 came from groundwater. GRACE showed that between December 2004 and November 2013, the Colorado River basin lost nearly 53 million acre-feet of freshwater, double the total volume of the country's largest reservoir, Lake Mead, on the Nevada-Arizona border. More than three-quarters of the total, about 41 million acre-feet, was from groundwater.
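The quoted GRACE figures are easy to sanity-check. The acre-foot conversion and the Lake Mead capacity below are standard reference values, not numbers taken from the study itself:

```python
# Sanity-checking the Colorado River Basin figures quoted above.
ACRE_FOOT_M3 = 1233.48      # cubic meters per acre-foot (standard conversion)
LAKE_MEAD_MAF = 26.1        # approx. full-pool capacity, million acre-feet

total_loss_maf = 53.0       # basin freshwater loss, Dec 2004 - Nov 2013
groundwater_maf = 41.0      # portion attributed to groundwater

groundwater_share = groundwater_maf / total_loss_maf    # ~0.77, "more than 75%"
mead_multiples = total_loss_maf / LAKE_MEAD_MAF         # ~2.0, i.e. "double"
loss_km3 = total_loss_maf * 1e6 * ACRE_FOOT_M3 / 1e9    # ~65 cubic kilometers
```

The three article claims (more than three-quarters from groundwater, double Lake Mead, a very large absolute volume) are mutually consistent.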

Katelyn Sesny's curator insight, October 31, 2014 11:41 AM

A lengthy but interesting article. The issue of the "Global Groundwater Crisis" might become a huge problem in the near future. - UNIT 1

Scooped by Dr. Stefan Gruenwald!

Lights out: study shows urgent need to address instability of world's power grid

Lights out: study shows urgent need to address instability of world's power grid | Amazing Science |

Research by Hugh Byrd, Professor of Architecture at the University of Lincoln, UK, and Steve Matthewman, Associate Professor of Sociology at the University of Auckland, New Zealand, highlights the insecurities of power systems and weakening electrical infrastructure across the globe, particularly in built-up urban areas.

The work builds on previous studies which examined a sharp increase in electrical usage over recent years, and warned the world to prepare for the prospect of coping without electricity as instances of complete power failure become increasingly common.

Professor Byrd explained: “We have previously highlighted that demand for new technology continues to grow at an unprecedented rate. Our new research emphasizes why energy sources are becoming increasingly inadequate, and simply cannot continue to meet this demand.

“Throughout our study, we observed a number of network failures due to inadequate energy, whether through depletion of resources such as oil and coal, or through the vagaries of the climate in the creation of renewable energy.”

The British energy regulator Ofgem has predicted a fall in spare electrical power production capacity to two per cent by 2015, meaning there is now even less flexibility of supply to adjust to spikes in demand. 

The issue of energy security exists for countries which have access to significant renewable power supplies too. With rain, wind and sunshine becoming less predictable due to changes brought about by global warming, the new research found that severe blackouts in Kenya, India, Tanzania and Venezuela, which all occurred during the last decade, were caused by shortages of rain at hydro dams.

Further to the irregularities involved in renewable power generation, the study concludes that worldwide electricity supply will also become increasingly precarious due to industry privatization and neglect of infrastructure.

Professor Matthewman said: “Over the past two decades, deregulation and privatization have become major global trends within the electrical power industry. In a competitive environment, reliability and profits may be at cross-purposes — single corporations can put their own interests ahead of the shared grid, and spare capacity is reduced in the name of cost saving. There is broad consensus among energy specialists, national advisory bodies, the reinsurance industry, and organizational sociologists that this has exacerbated blackout risk.”

These trends have seen the separation of power generation, transmission and distribution services – a process which Professors Byrd and Matthewman suggest only opens up more opportunity for electrical disruption. Their study reveals the difficulties that arise when different technical and human systems need to communicate, and points to a breakdown in this type of communication as the main cause behind two of the world's worst-ever blackouts – one stretching from Ohio, USA, to Ontario, Canada, in 2003; and one across Italy and neighboring nations in the same year. Together, these power failures affected more than 100 million people.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

New study strengthens link between Arctic sea-ice loss and extreme winters

New study strengthens link between Arctic sea-ice loss and extreme winters | Amazing Science |

Declining Arctic sea-ice has made severe winters across central Asia twice as likely, new research shows. The paper is the latest in a series linking very cold winters in the northern hemisphere to rapidly increasing temperatures in the Arctic. But the long-term picture suggests these cold winters might only be a temporary feature before further warming takes hold.

Temperatures in the Arctic are increasing almost twice as fast as the global average. This is known as Arctic amplification. As Arctic sea-ice shrinks, energy from the sun that would have been reflected away by sea-ice is instead absorbed by the ocean.

Arctic amplification has been linked with very cold winters in mid-latitude regions of the northern hemisphere. The UK, the US and Canada have all experienced extreme winters in recent years. Just last year, for example, the UK had its second-coldest March since records began, prompting the Met Office to call a rapid response meeting of experts to get to grips with whether melting Arctic sea-ice could be affecting British weather.

The new study, published in Nature Geoscience, suggests the likelihood of severe winters in central Asia has doubled over the past decade. This vast region includes southern Russia, Mongolia, Kazakhstan, and northern China. And it's the Arctic that's driving the changes once again, the authors say.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Noncovalent, Self-Assembled, Robust, Porous Material That Adsorbs Greenhouse Gas

Noncovalent, Self-Assembled, Robust, Porous Material That Adsorbs Greenhouse Gas | Amazing Science |

Researchers from the Department of Chemistry at the University of Houston have created noncovalent organic frameworks, a new type of porous material that overcomes some barriers in the development of porous material technologies. The new material is highly processable and self-assembles into a superstructure with large, 16-angstrom pores (Figure above). It has a high affinity for hydrocarbons, suggesting applications as an energy-storage substrate. In addition, the material captures CFCs and fluorocarbons, both potent greenhouse gases. The capture capacity is up to 75% of the material's original weight.

The field of porous materials has seen two prior advances, metal-organic and covalent organic frameworks, but both suffer from low processability: their extended crystalline structure makes them impossible to dissolve without decomposition.

Remarkably, the building block of the noncovalent porous material is a single molecule, a trispyrazole, which stacks and self-assembles into a large, porous, crystal-like configuration. The authors characterize the pores as "infinite one-dimensional channels protruding throughout the crystal along the crystallographic c axis". The interior lining of the channels is arrayed with fluorine atoms.

The entire superstructure is stabilized by noncovalent hydrogen bonds and pi-pi stacking, hallmarks of a "supramolecular" material. H-bonds and pi-interactions are considered "weak" associations between molecules, but by virtue of the sheer number and surface area of interactions, the material turns out to be thermally very stable (up to 250 degrees C) and resistant to solvents, acids and bases. Of particular interest to engineers: the material's solubility in DMSO can be tuned by temperature.

A key figure of merit for porous materials is the effective surface area of the pores per gram of material. A common measure is the Brunauer-Emmett-Teller (BET) surface area. Using nitrogen adsorption measurements, the surface area was determined to be 1,159 m²/g. For comparison, the activated charcoal used in water filters has a surface area of about 500 m²/g. This high surface area is the reason for the high capture capacity (75% by weight).
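The adsorption numbers are easy to put in perspective. The arithmetic below uses only the figures quoted above:

```python
# Comparing the framework's BET surface area to activated charcoal.
framework_m2_per_g = 1159.0   # from N2 adsorption, quoted above
charcoal_m2_per_g = 500.0     # typical activated charcoal, for comparison

area_ratio = framework_m2_per_g / charcoal_m2_per_g   # ~2.3x charcoal
sample_area_m2 = 10 * framework_m2_per_g              # a 10 g sample: ~11,600 m^2

# At 75% capture by weight, one gram of framework holds 0.75 g of gas.
gas_per_gram = 0.75
```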

No comment yet.
Scooped by Dr. Stefan Gruenwald!

What Happens After Someone Survives Ebola?

What Happens After Someone Survives Ebola? | Amazing Science |

While most of the recent coverage of the ongoing Ebola outbreak has focused on rising death tolls and a few infected U.S. citizens, other segments of the population have passed mostly unnoticed from the harsh glare of the media spotlight: Survivors, and those who are seemingly immune to Ebola.

People who survive Ebola can lead normal lives post-recovery, though occasionally they can suffer inflammatory conditions of the joints afterwards, according to CBS. Recovery times can vary, and so can the amount of time it takes for the virus to clear out of the system.

The World Health Organization found that the virus can reside in semen for up to seven weeks after recovery. Survivors are generally assumed to be immune to the particular strain they are infected by, and are able to help tend to others infected with the same strain. What isn't clear is whether or not a person is immune to other strains of Ebola, or if their immunity will last.

As with most viral infections, patients who recover from Ebola end up with Ebola-fighting antibodies in their blood, making their blood a valuable (if controversial) treatment option for others who catch the infection. Kent Brantly, one of the most recognizable Ebola survivors, has donated more than a gallon of his blood to other patients. The plasma of his blood, which contains the antibodies, is separated out from the red blood cells, creating what’s known as a convalescent serum, which can then be given to a patient as a transfusion. The hope is that the antibodies in the serum will boost the patient’s immune response, attacking the virus, and allowing the body to recover.

But this treatment method, like all Ebola treatment methods, is far from ideal. To start with, scientists aren't even sure if it works. In addition, the serum can only be donated to people with a compatible blood type to the donor, and it’s unclear how long the immunity would last. Adding to the confusion, there are several different strains of Ebola, and there’s no guarantee that once someone has recovered from one strain of Ebola they are immune to others.

When Nancy Writebol, one of the survivors of Ebola who was whisked back to Atlanta soon after contracting the virus, was asked by Science Magazine if she would consider going back, she said: “I’ve done some reading on that and talked to doctors at Emory about that. My doctors at Emory are not sure how long immunity would last. It’s not been studied. I’ve read that even if a survivor was willing and able to help with the care for Ebola patients, because there are so many strains of Ebola, it would still be very wise and necessary to operate in PPEs and not just assume you’re immune.”

People who survived the disease are of particular interest to researchers, such as those working on the ZMapp drug, who hope that they can synthesize antibodies in the hopes of creating a cure.

But even less understood than the survivors are the people who were infected with Ebola but never developed any symptoms. After outbreaks in Uganda in the late 1990s, scientists tested the blood of several people who were in close contact with Ebola patients, and found that a number of them had markers in their blood indicating they carried the disease, yet they were totally asymptomatic: they managed to completely avoid the horrifying symptoms of the disease.

In a letter in the Lancet this week, researchers look into the existence of these asymptomatic patients, and hope that identifying people who are naturally immune could help contain the outbreak as scientists work on developing a treatment. A 2010 study published by the French research organization IRD found that as much as 15.3 percent of Gabon's population could be immune to Ebola.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

NASA: September Was Warmest Month on Record

NASA: September Was Warmest Month on Record | Amazing Science |

The global average temperature in September was the warmest in a record dating back to 1880, according to an update from NASA's Goddard Institute for Space Studies. That makes it two months in a row: August was also the hottest on record by NASA's reckoning. Later this week, the National Oceanic and Atmospheric Administration will release its own, independent calculation of how September 2014 stacked up. Sometimes NOAA's calculation differs.

Unless something really weird happens, 2014 is on track to be the warmest in the instrumental record.

The map above shows how temperatures around the globe varied from the long-term average in September. Two things catch my eye:

  • The warmth bedeviling roughly the western third of the United States and all the way up into Alaska. This is the result of a high-pressure ridge that has remained stubbornly and remarkably persistent for many months (with some short-term ebbing and flowing, to be sure). The ridge has also prevented storms from reaching California, bringing record drought. The flip side is a trough of low pressure across the U.S. mid-section, which has brought cool temperatures this year, although in September, the map indicates temperatures close to normal across that region. (For more detail on this pattern, check out this post from the California Weather Blog.)

  • The tongue of warm ocean water extending west from the coast of South America out into the central Pacific also catches my attention. Warm sea surface temperatures persisting in the eastern and central Pacific comprise the signature of an El Niño trying to be born. Labor pains have been ongoing for quite some time now, and the odds are good that a weak El Niño will emerge in the next couple of months.
No comment yet.
Scooped by Dr. Stefan Gruenwald!

The Berkeley Blog: Ebola infection numbers still doubling every month

The Berkeley Blog: Ebola infection numbers still doubling every month | Amazing Science |

CDC projections show the epidemic could affect between 550,000 and 1.4 million people by January if the international response doesn't improve. We'll come back to that last point soon, but for now, let's stick with the numbers.

WHO analysis released the same day noted that this strain of Ebola is killing about 70 percent of the people it infects, meaning the 550,000 cases (the CDC's lower number) could produce 385,000 deaths. That's the death toll of a good-sized, many-year civil war. In about 10 months.

How could that happen so quickly? At the end of August, there were 3,069 known cases. At the end of September, the total was 7,157.

The numbers are still in the thousands, but look at how quickly they're growing. The CDC estimates the number of cases is doubling every 15-20 days in Liberia and every 30-40 days in Sierra Leone. The World Health Organization, which released a separate analysis the same day the CDC projections came out, estimated doubling times of 24 days in Liberia and 30 days in Sierra Leone.
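The month-on-month totals above imply a doubling time in the range the agencies report. A quick check, treating the two counts as roughly 30 days apart (note the counts are multi-country totals, while the agency estimates are per country):

```python
import math

aug_cases, sep_cases = 3069, 7157   # totals quoted above
elapsed_days = 30                    # roughly end of August to end of September

growth_factor = sep_cases / aug_cases                                 # ~2.33
doubling_days = elapsed_days * math.log(2) / math.log(growth_factor)
# ~24.6 days: between the per-country estimates, as expected for an
# aggregate mixing faster (Liberia) and slower (Sierra Leone) growth
```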

Those kinds of doubling times produce growth curves that look like this. The WHO team also made future projections. Those are more conservative than the CDC’s, in part because they only go through November 2, at which point they project 20,000 cases.

The CDC and WHO did different things, and in some ways the WHO paper is the more compelling. It provides the most detailed description of the epidemic to date, with demographic and geographic breakdowns that can help with planning. WHO researchers also estimated the case fatality rate (the 70 percent death toll cited above) and the reproductive number.

The reproductive number (R) for a disease is the number of people one sick person will infect. If R is greater than 1 — if a sick person will cause more than one new infection — the epidemic will grow. If it’s less than 1, the epidemic will die out. WHO estimated that at the start of this epidemic, the reproductive number for Ebola was 1.71 in Guinea, 1.83 in Liberia, and 2.02 in Sierra Leone. By August, R was still above 1 in all three, but it appeared to be coming down in Liberia and Sierra Leone; the situation in Guinea was less clear.
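The link between R and how fast cases grow can be made concrete. If cases multiply by R once per serial interval (the average time from one infection to the next; the 15-day figure below is an assumed round value for illustration, not from the WHO analysis), the doubling time follows directly:

```python
import math

SERIAL_INTERVAL_DAYS = 15   # assumed round value for illustration

def doubling_time_days(r):
    """Days for case counts to double when each case causes R new
    cases per serial interval. Defined only for a growing epidemic."""
    if r <= 1:
        raise ValueError("R <= 1 means the epidemic shrinks, not doubles")
    return SERIAL_INTERVAL_DAYS * math.log(2) / math.log(r)

# WHO's estimated initial reproductive numbers, from the text above
for country, r in [("Guinea", 1.71), ("Liberia", 1.83), ("Sierra Leone", 2.02)]:
    print(f"{country}: R = {r}, cases double every {doubling_time_days(r):.0f} days")
```

An R of exactly 2 doubles the caseload once per serial interval; pushing R below 1, the goal of the combined interventions, makes the function raise an error because counts then halve instead of doubling.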

None of this is to say the CDC analysis wasn’t important, too. Looking months into the future is a good idea. And the CDC researchers did something the WHO analysis failed to do — they estimated the number of cases that have not made it into official tallies. That’s how they came up with the 1.4 million figure. However, as both the paper and officials repeatedly noted, the 550,000 to 1.4 million numbers represent a possible “worst case” outcome—one that could happen if there were no effort to slow the epidemic.

NIH Ebola Information

No comment yet.
Scooped by Dr. Stefan Gruenwald!

WHO: What we know about transmission of the Ebola virus among humans

WHO: What we know about transmission of the Ebola virus among humans | Amazing Science |
Ebola virus disease is not an airborne infection. Airborne spread among humans implies inhalation of an infectious dose of virus from a suspended cloud of small dried droplets.

This mode of transmission has not been observed during extensive studies of the Ebola virus over several decades.

Common sense and observation tell us that spread of the virus via coughing or sneezing is rare, if it happens at all. Epidemiological data emerging from the outbreak are not consistent with the pattern of spread seen with airborne viruses, like those that cause measles and chickenpox, or the airborne bacterium that causes tuberculosis.

Theoretically, wet and bigger droplets from a heavily infected individual, who has respiratory symptoms caused by other conditions or who vomits violently, could transmit the virus – over a short distance – to another nearby person.

This could happen when virus-laden heavy droplets are directly propelled by coughing or sneezing (which does not mean airborne transmission) onto the mucous membranes, or onto skin with cuts or abrasions, of another person.

WHO is not aware of any studies that actually document this mode of transmission. On the contrary, good quality studies from previous Ebola outbreaks show that all cases were infected by direct close contact with symptomatic patients.

The Ebola virus is transmitted among humans through close and direct physical contact with infected bodily fluids, the most infectious being blood, feces and vomit.

The Ebola virus has also been detected in breast milk, urine and semen. In a convalescent male, the virus can persist in semen for at least 70 days; one study suggests persistence for more than 90 days.

Saliva and tears may also carry some risk. However, the studies implicating these additional bodily fluids were extremely limited in sample size and the science is inconclusive. In studies of saliva, the virus was found most frequently in patients at a severe stage of illness. The whole live virus has never been isolated from sweat.

The Ebola virus can also be transmitted indirectly, by contact with previously contaminated surfaces and objects. The risk of transmission from these surfaces is low and can be reduced even further by appropriate cleaning and disinfection procedures.

NIH Ebola Information

Eric Chan Wei Chiang's curator insight, October 10, 2014 2:39 AM

These are some really good facts about the current Ebola outbreak.


Local authorities in affected countries are making creative use of ICT to help fight Ebola



Scooped by Dr. Stefan Gruenwald!

High Efficiency Achieved for Harvesting Hydrogen Fuel From the Sun using Earth-Abundant Materials

High Efficiency Achieved for Harvesting Hydrogen Fuel From the Sun using Earth-Abundant Materials | Amazing Science |

Today, the journal Science published the latest development in Michael Grätzel’s laboratory at EPFL: producing hydrogen fuel from sunlight and water. By combining a pair of solar cells made with a mineral called perovskite and low cost electrodes, scientists have obtained a 12.3 percent conversion efficiency from solar energy to hydrogen, a record using earth-abundant materials as opposed to rare metals.

The race is on to optimize solar energy’s performance. More efficient silicon photovoltaic panels, dye-sensitized solar cells, concentrated cells and thermodynamic solar plants all pursue the same goal: to produce a maximum amount of electrons from sunlight. Those electrons can then be converted into electricity to turn on lights and power your refrigerator.

At the Laboratory of Photonics and Interfaces at EPFL, led by Michael Grätzel, where scientists invented dye solar cells that mimic photosynthesis in plants, they have also developed methods for generating fuels such as hydrogen through solar water splitting. To do this, they either use photoelectrochemical cells that directly split water into hydrogen and oxygen when exposed to sunlight, or they combine electricity-generating cells with an electrolyzer that separates the water molecules.

By using the latter technique, Grätzel’s post-doctoral student Jingshan Luo and his colleagues obtained a performance so spectacular that their achievement is being published today in the journal Science. Their device converts 12.3 percent of the solar energy falling on its perovskite absorbers into hydrogen. Perovskite is a compound that can be made in the laboratory from common materials, such as those used in conventional car batteries, eliminating the need for rare metals in the production of usable hydrogen fuel.
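A rough back-of-the-envelope shows what 12.3 percent solar-to-hydrogen efficiency means in fuel terms. The insolation and hydrogen energy-density figures below are standard reference values and assumptions, not numbers from the paper:

```python
PEAK_SUN_W_PER_M2 = 1000.0    # standard test-condition insolation
SUN_HOURS_PER_DAY = 5.0       # assumed equivalent full-sun hours per day
EFFICIENCY = 0.123            # solar-to-hydrogen, from the article
H2_ENERGY_MJ_PER_KG = 120.0   # lower heating value of hydrogen

solar_mj_per_m2_day = PEAK_SUN_W_PER_M2 * 3600 * SUN_HOURS_PER_DAY / 1e6  # 18 MJ
h2_grams_per_m2_day = (solar_mj_per_m2_day * EFFICIENCY
                       / H2_ENERGY_MJ_PER_KG * 1000)
# roughly 18 g of hydrogen per square meter per sunny day
```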

No comment yet.
Scooped by Dr. Stefan Gruenwald!

World's first "solar battery" runs on light and air and stores its own power

World's first "solar battery" runs on light and air and stores its own power | Amazing Science |

Researchers at The Ohio State University have invented a solar battery -- a combination solar cell and battery -- which recharges itself using air and light. The design required a solar panel which captured light, but admitted air to the battery. Here, scanning electron microscope images show the solution: nanometer-sized rods of titanium dioxide (larger image) which cover the surface of a piece of titanium gauze (inset). The holes in the gauze are approximately 200 micrometers across, allowing air to enter the battery while the rods gather light. Image courtesy of Yiying Wu, The Ohio State University.

When the battery discharges, it chemically consumes oxygen from the air to form lithium peroxide at the electrode. An iodide additive in the electrolyte acts as a “shuttle” that carries electrons between the battery electrode and the mesh solar panel. The use of the additive represents a distinct approach to improving battery performance and efficiency, the team said.

The mesh belongs to a class of devices called dye-sensitized solar cells, because the researchers used a red dye to tune the wavelength of light it captures.

In tests, they charged and discharged the battery repeatedly, while doctoral student Lu Ma used X-ray photoelectron spectroscopy to analyze how well the electrode materials survived—an indication of battery life.

First they used a ruthenium compound as the red dye, but since the dye was consumed in the light capture, the battery ran out of dye after eight hours of charging and discharging—too short a lifetime. So they turned to a dark red semiconductor that wouldn’t be consumed: hematite, or iron oxide—more commonly called rust.

Coating the mesh with rust enabled the battery to charge from sunlight while retaining its red color. Based on early tests, Wu and his team think that the solar battery’s lifetime will be comparable to rechargeable batteries already on the market.

No comment yet.