Amazing Science
Rescooped by Dr. Stefan Gruenwald from Cool New Tech!

Scientists make lightweight wire from carbon that may rival copper


Ten times lighter than copper and 30 times stronger — scientists at Cambridge University are hoping carbon nanotubes will replace copper as a way to conduct electricity in the future.


Scientists have made a strong, lightweight wire from carbon that might eventually be a rival to copper if its ability to conduct electricity can be improved, Cambridge University said.


They said it was the first time that the super-strong carbon wires, spun in a tiny furnace that looks like a cotton candy machine with temperatures above 1,800 F, had been made "in a usable form" a millimeter thick.


Krzysztof Koziol of the University's department of materials science and metallurgy told Reuters in a telephone interview that commercial applications were still years away but that "our target is to beat copper".


Wire made in the laboratory from carbon nanotubes (CNTs) — microscopic hollow cylinders composed of carbon atoms — is 10 times lighter than copper and 30 times stronger, the university said in a statement.

Via Kalani Kirk Hausman
Scooped by Dr. Stefan Gruenwald!

Graphene-based supercapacitors a step closer to commercial reality


Graphene-based supercapacitors have already proven the equal of conventional supercapacitors – in the lab. But now researchers at Melbourne’s Monash University claim to have developed a new scalable and cost-effective technique to engineer graphene-based supercapacitors that brings them a step closer to commercial development.


With their almost indefinite lifespan and ability to recharge in seconds, supercapacitors have tremendous energy-storage potential for everything from portable electronics, to electric vehicles and even large-scale renewable energy plants. But the drawback of existing supercapacitors has been their low energy density of around 5 to 8 Wh/liter, which means they either have to be exceedingly large or recharged frequently.


Professor Dan Li and his team at Monash University’s Department of Materials Engineering have created a graphene-based supercapacitor with an energy density of 60 Wh/liter, which is around 12 times higher than that of commercially available supercapacitors and in the same league as lead-acid batteries. The device also lasts as long as a conventional battery.
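As a quick arithmetic check, the quoted "around 12 times" follows from the low end of the conventional range; a minimal sketch using only the article's figures:

```python
# Energy densities quoted in the article, in Wh per liter.
conventional_wh_per_l = 5.0   # low end of the 5-8 Wh/liter range for existing supercapacitors
graphene_wh_per_l = 60.0      # the Monash graphene-based device

# Ratio of the new device to a conventional supercapacitor.
improvement = graphene_wh_per_l / conventional_wh_per_l
print(improvement)  # 12.0 -- the "around 12 times higher" figure uses the low end of the range
```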


To maximize the energy density, the team created a compact electrode from an adaptive graphene gel film they had previously developed. To control the spacing between graphene sheets on the sub-nanometer scale, the team used liquid electrolytes, which are generally used as the conductor in conventional supercapacitors.


Unlike conventional supercapacitors, which are generally made of highly porous carbon with unnecessarily large pores and rely on a liquid electrolyte to transport the electrical charge, the liquid electrolyte in Li’s team’s supercapacitor plays a dual role: it conducts electricity and also maintains the minute space between the graphene sheets. This maximizes the density without compromising the supercapacitor’s porosity, they claim.


To create their compact electrode, the researchers used a technique similar to one used in traditional paper making, which they say makes the process cost-effective and easily scalable for industrial applications.


"We have created a macroscopic graphene material that is a step beyond what has been achieved previously. It is almost at the stage of moving from the lab to commercial development," Professor Li said.

asysan's curator insight, May 13, 2015 8:54 AM
I do this in my 4th-year computer science class
Scooped by Dr. Stefan Gruenwald!

'Total Recall' for Mice: Implanting False Memories into a Mouse Brain


Our imperfect memory is inconvenient at the grocery store and downright dangerous on the witness stand. In extreme cases, we may be confident that we remember something that never happened at all. Now, a group of neuroscientists say that they’ve identified a potential mechanism of false memory creation and have planted such a memory in the brain of a mouse.

Neuroscientists are only beginning to tackle the phenomenon of false memory, says Susumu Tonegawa of the Massachusetts Institute of Technology in Cambridge, whose team conducted the new research. “It’s there, and it’s well established,” he says, “but the brain mechanisms underlying this false memory are poorly known.” With optogenetics—the precise stimulation of neurons with light—scientists can seek out the physical basis of recall and even tweak it a bit, using mouse models.


Like us, mice develop memories based on context. When a mouse returns to an environment where it felt pain in the past, it recalls that experience and freezes with fear. Tonegawa’s team knew that the hippocampus, a part of the brain responsible for establishing memory, plays a role in encoding context-based experiences, and that stimulating cells in a part of the hippocampus called the dentate gyrus can make a mouse recall and react to a mild electric shock that it received in the past. The new goal was to connect that same painful shock memory to a context where the mouse had not actually received a shock.


First, the team introduced a mouse to a chamber that it had never seen before and allowed it to explore the sights and smells: a black floor, dim red light, and the scent of acetic acid. In this genetically modified variety of mouse, neurons in the hippocampus will produce a light-sensitive protein when they become active. Because only the neurons involved in the mouse’s experience of this chamber became sensitive to light, these cells were essentially labeled for later reactivation.


The next day, the mouse found itself in a decidedly more unpleasant chamber: The lights, colors, and smells were all different, and it received a series of mild electric shocks to its feet. While the mouse was getting shocked, the scientists used optical fibers implanted in its brain to shine pulses of blue light on its dentate gyrus, reactivating specific cells that had been labeled the day before as the mouse explored the first, less painful chamber. The hope was that the mouse would form a new (and totally false) association between the first room and the painful shocks.


Even though the mouse never got shocked in the red-and-black, acid-scented room, it froze in fear when it returned there, confirming that it had formed a false, context-specific memory, the team reports online today in Science. Tonegawa says that it’s impossible to know just what the mouse experienced as the scientists stimulated its brain with light—whether it felt some or all of those earlier sensations, or even perceived that it was back in the first chamber during the shocks. But it is clear that the rodent recalled a painful experience when it returned to that first environment. It showed no signs of fear when placed in a third, unfamiliar chamber, demonstrating that the fear response was indeed triggered by the first room.


Tonegawa suggests that these results could help explain some of the cases in which humans form false memories. We are constantly imagining, daydreaming, and remembering, and these activities might alter our experience of the events around us, he says. He offers the extreme example of a woman who was watching a TV show at home when someone broke in and assaulted her. She later insisted that the host of the show had been her attacker, apparently transplanting the object of her attention into a memory of the physical experience.


The results are “clear and strong,” and the work is “a very profound finding,” says neuroscientist Mark Mayford of the Scripps Research Institute in San Diego, California, who was not involved in the study. No previous experiment has shown that activating a precise pattern of cells can serve as a substitute for a real-life experience and create a learned behavior, he says. Mayford, whose work also focuses on learning and memory manipulation in mice, says it’s theoretically possible that humans form false memories in a similar way. But more importantly, he says, the research offers clues about where and how a new experience gets encoded in the brain to begin with. With this knowledge, he believes that neuroscientists can start to take a more quantitative approach, someday figuring out how many neurons it takes to give us the perception of what’s around us and what goes on in our neural wiring when we remember—or misremember—the past.

Scooped by Dr. Stefan Gruenwald!

Researchers Reveal Hidden Magnetic Waves in High-Temperature Superconductors


New research from the Brookhaven National Laboratory has revealed that magnetic excitations, quantum waves believed by many to regulate high-temperature superconductors, exist in both non-superconducting and superconducting materials.

Intrinsic inefficiencies plague current systems for the generation and delivery of electricity, with significant energy lost in transit. High-temperature superconductors (HTS)—uniquely capable of transmitting electricity with zero loss when chilled to subzero temperatures—could revolutionize the planet’s aging and imperfect energy infrastructure, but the remarkable materials remain fundamentally puzzling to physicists. To unlock the true potential of HTS technology, scientists must navigate a quantum-scale labyrinth and pin down the phenomenon’s source.


Now, scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory and other collaborating institutions have discovered a surprising twist in the magnetic properties of HTS, challenging some of the leading theories. In a new study, published online in the journal Nature Materials on August 4, 2013, scientists found that unexpected magnetic excitations—quantum waves believed by many to regulate HTS—exist in both non-superconducting and superconducting materials.


“This is a major experimental clue about which magnetic excitations are important for high-temperature superconductivity,” said Mark Dean, a physicist at Brookhaven Lab and lead author on the new paper. “Cutting-edge x-ray scattering techniques allowed us to see excitations in samples previously thought to be essentially non-magnetic.”


On the atomic scale, electron spins—a bit like tiny bar magnets pointed in specific directions—rapidly interact with each other throughout magnetic materials. When one spin rotates, this disturbance can propagate through the material as a wave, tipping and aligning the spins of neighboring electrons. Many researchers believe that this subtle excitation wave may bind electrons together to create the perfect current conveyance of HTS, which operates at slightly warmer temperatures than traditional superconductivity.


“Proving or disproving this hypothesis remains one of the holy grails of condensed matter physics research,” Dean said. “This discovery gives us a new way to evaluate rival theories of HTS.”

Scooped by Dr. Stefan Gruenwald!

Hubble Reveals a New Kind of Stellar Blast Called a Kilonova


NASA’s Hubble Space Telescope revealed a new type of stellar explosion called a kilonova. Kilonovas are about 1,000 times brighter than a nova, but they are 1/10th to 1/100th the brightness of a typical supernova.

NASA’s Hubble Space Telescope recently provided the strongest evidence yet that short-duration gamma ray bursts are produced by the merger of two small, super-dense stellar objects.


The evidence is in the detection of a new kind of stellar blast called a kilonova, which results from the energy released when a pair of compact objects crash together. Hubble observed the fading fireball from a kilonova last month, following a short gamma ray burst (GRB) in a galaxy almost 4 billion light-years from Earth. A kilonova had been predicted to accompany a short-duration GRB, but had not been seen before.
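For readers used to astronomical magnitudes, the brightness ratios quoted above translate via the standard Pogson relation (the formula is textbook; the ratios are the article's):

```python
import math

def delta_magnitude(brightness_ratio):
    """Magnitude difference corresponding to a brightness ratio (Pogson relation)."""
    return 2.5 * math.log10(brightness_ratio)

# A kilonova is about 1,000 times brighter than a nova...
print(delta_magnitude(1000))  # 7.5 magnitudes brighter than a nova
# ...but 1/10th to 1/100th the brightness of a typical supernova.
print(delta_magnitude(10), delta_magnitude(100))  # 2.5 to 5.0 magnitudes fainter
```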


“This observation finally solves the mystery of the origin of short gamma ray bursts,” said Nial Tanvir of the University of Leicester in the United Kingdom. Tanvir led a team of researchers using Hubble to study the recent short-duration GRB. “Many astronomers, including our group, have already provided a great deal of evidence that long-duration gamma ray bursts (those lasting more than two seconds) are produced by the collapse of extremely massive stars. But we only had weak circumstantial evidence that short bursts were produced by the merger of compact objects. This result now appears to provide definitive proof supporting that scenario.”

Scooped by Dr. Stefan Gruenwald!

World's Smallest Terahertz Detector Invented by University of Pittsburgh Physicists


Molecules could soon be “scanned” in a fashion similar to imaging screenings at airports, thanks to a detector developed by University of Pittsburgh physicists. The detector, featured in a recent issue of Nano Letters, may have the ability to chemically identify single molecules using terahertz radiation—a range of light far below what the eye can detect.

“Our invention allows lines to be ‘written’ and ‘erased’ much in the manner that an Etch A Sketch® toy operates,” said study coauthor Jeremy Levy, professor in the Department of Physics and Astronomy within the Kenneth P. Dietrich School of Arts and Sciences. “The only difference is that the smallest feature is a trillion times smaller than the children’s toy, able to create conductive lines as narrow as two nanometers.”

Terahertz radiation refers to a color range far beyond what the eye can see and is useful for identifying specific types of molecules. This type of radiation is generated and detected with the help of an ultrafast laser, a strobe light that turns on and off in less than 30 femtoseconds (a unit of time equal to 10⁻¹⁵ seconds). Terahertz imaging is commonly used in airport scanners, but has been hard to apply to individual molecules due to a lack of sources and detectors at those scales.

“We believe it would be possible to isolate and probe single nanostructures and even molecules—performing ‘terahertz spectroscopy’ at the ultimate level of a single molecule,” said Levy. “Such resolution will be unprecedented and could be useful for fundamental studies as well as more practical applications.”

Levy and his team are currently performing spectroscopy of molecules and nanoparticles. In the future, they hope to work with C60, a molecule whose features within the terahertz spectrum are well known.

The oxide materials used for this research were provided by study coauthor Chang-Beom Eom, Theodore H. Geballe Professor and Harvey D. Spangler Distinguished Professor at the University of Wisconsin-Madison College of Engineering. 

Scooped by Dr. Stefan Gruenwald!

Surprising chemistry: Two differently colored crystals - orange and blue - from one chemical in the same flask


Chemists have unexpectedly made two differently colored crystals - one orange, the other blue - from one chemical in the same flask while studying a special kind of molecular connection called an agostic bond. The discovery, reported in Angewandte Chemie International Edition on July 29, is providing new insights into important industrial chemical reactions such as those that occur while making plastics and fuels.


"We were studying agostic bonds in a project to make liquid fuels like methanol from carbon dioxide to replace fuels we get from oil," said chemist Morris Bullock at the Department of Energy's Pacific Northwest National Laboratory. "We knew the molecule we were making would have an agostic bond, but we had no idea there'd be two flavors of these metal complexes."

While chemists have studied these bonds in chemicals in liquid form, no one had crystallized one chemical with multiple forms of its agostic bonds. And no one expected different forms to give rise to different colors.


Bonds come in many varieties in molecules. They string atoms together, sometimes forming a trunk and branches of atoms like a tree. But the trunk and branches of chemicals often fold up into a more compact shape, requiring additional weaker bonds to hold the shape in place. An agostic bond is one of these additional bonds, a shape-holder. They occur between a metal and a distant carbon-hydrogen bond along some chain, folding the chain back to the metal and pinning it there.


First discovered in the 1980s, agostic bonds frequently occur in catalysts because catalysts usually contain metals. This work will help researchers get a better handle on some catalytic reactions found in common industrial processes such as making plastic or fuels.


The metal in a catalyst is usually the reactive heart of the molecule. Bullock and postdoctoral chemist Edwin van der Eide knew an agostic bond in their catalyst would help protect the reactive metal from working at the wrong time: The carbon-hydrogen bond blocks the reactive metal until conditions are right, which in turn would help the scientists better control the catalytic reactions. So van der Eide set about producing and crystallizing catalysts that contain a metal atom — in this case, molybdenum.


In the lab, van der Eide's flask of chemicals held a molybdenum-containing molecule that turned the solution violet. He added another liquid to coax the molybdenum complex to crystallize, just as salt crystallizes from seawater to form flakes at the seashore. Some crystals formed at the bottom of the flask and others formed near the top of the violet solution.


Oddly, the crystals were two different colors. Orange crystals formed at the bottom of the flask and blue above. If van der Eide dissolved either the orange or blue crystals in a fresh flask of the original solvent, the violet color returned, with the same properties as the original violet solution. These results suggested that either molecule in the two colored solids could give rise to both structures in liquid, where they easily change back and forth.


The researchers examined the differently colored crystals to determine their structures. The molecule forms a shape like a piano stool: a ringed section forms a stool seat on top of the molybdenum atom, with multiple legs connecting to the molybdenum at the bottom.


One of the legs, however, is longer than the others and contains a chain of three carbon atoms, each with at least one protruding hydrogen. The team found that the long leg was involved in the agostic bonds, with the middle carbon atom involved in the orange crystals and an end carbon involved in the blue crystals.

Scooped by Dr. Stefan Gruenwald!

Amazing Science: Evolution Postings


Evolution is the change in the inherited characteristics of biological populations over successive generations. Evolutionary processes give rise to diversity at every level of biological organisation, including species, individual organisms and molecules such as DNA and proteins. All life on Earth is descended from a last universal ancestor that lived approximately 3.8 billion years ago. Repeated speciation and the divergence of life can be inferred from shared sets of biochemical and morphological traits, or by shared DNA sequences.

Scooped by Dr. Stefan Gruenwald!

Astronomers Discover Two Subdwarf Heavy Metal Stars


A team of astronomers from Taiwan and the UK has discovered two unusual subdwarf stars with extremely high concentrations of lead in their atmospheres. These stars, labeled HE 2359-2844 and HE 1256-2738, have surfaces containing 10,000 times more lead than is present on the surface of the Sun.

HE 2359-2844 is a subdwarf located about 800 light-years away in the constellation of Sculptor. The star HE 1256-2738 is located about 1,000 light-years away in the constellation of Hydra.

Using observations from the archives of the ESO’s Very Large Telescope in Chile, the scientists identified a few features in the spectra of both stars that did not match any atoms expected to be present. After some detective work, they realized that the features were due to lead. Lead is one of the heaviest naturally occurring elements; in the Sun there is less than one lead atom for every ten billion hydrogen atoms.


At around 38,000 degrees Celsius, the surfaces of HE 2359-2844 and HE 1256-2738 are so hot that three electrons are removed from every lead atom. The resulting ions produce distinctive lines in the star’s spectrum, from which the concentration of lead in the atmosphere can be measured. Using the same technique, HE 2359-2844 was also found to show ten thousand times more yttrium and zirconium than on the Sun. Along with the zirconium star LS IV-14 116, the newly discovered stars form a new group of ‘heavy metal subdwarfs.’
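Combining the two figures in this article gives a feel for how extreme these atmospheres are; simple arithmetic on the quoted numbers:

```python
# Solar benchmark from the article: fewer than one lead atom per ten billion hydrogen atoms.
solar_pb_per_h = 1 / 10_000_000_000      # about 1e-10

# The subdwarf surfaces carry roughly 10,000 times more lead than the Sun's.
subdwarf_pb_per_h = 10_000 * solar_pb_per_h

print(subdwarf_pb_per_h)  # about 1e-06, i.e. roughly one lead atom per million hydrogen atoms
```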

Scooped by Dr. Stefan Gruenwald!

D-Wave’s quantum optimizer might be quantum after all -- clearly isn't classical


Quantum optimizer manufacturer D-Wave Systems has been gaining a lot of traction recently. They've sold systems to Lockheed Martin and Google, and started producing results showing that their system can solve problems that are getting closer to having real-life applications. All in all, they have come a long way since the first hype-filled announcement.


According to a recent paper in Nature Communications, the D-Wave device is not doing classical simulated annealing. Unfortunately, that finding means exactly what it says: it tells us what the device isn't, but not what it is.


To go into this a little more deeply, the researchers analyzed how the coupling between the magnets created a ground state. The layout of the hardware consists of four inner magnets arranged in a diamond (so each magnet is coupled directly to two others). Each of these is coupled to one additional magnet, but those are not coupled to each other. This configuration appears to be set up such that the four inner magnets always have the same orientation, while the outer magnets are free to arrange themselves as they see fit.


This results in a rather strange set of 17 possible ground states, most of which can be reached in steps of single flips of magnets. Except for the last, which requires that all four inner magnets flip at the same time.


In a classical simulation, the set of magnets can sample many different states. But, if by chance it happens to flip into this last ground state, it becomes trapped there. Furthermore, once it is there, the outer magnets become trapped in a single state too, because all other configurations have higher energy. Of course, once in this isolated state, it can also get out by flipping all four inner magnets, but the isolation and lack of noise (the outer magnets can't flip either) mean that it is, in some sense, less likely to flip out of the state than into it.
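The trapping argument can be illustrated with a toy Ising model on the eight-magnet layout described above (four inner magnets in a diamond, one pendant magnet on each). The coupling values below are illustrative guesses, not the couplings from the Nature Communications paper, and this toy has only two ground states rather than the seventeen of the real device; it does, however, show the same barrier to single-flip dynamics:

```python
import itertools

# Toy Ising version of the eight-magnet layout: four inner spins in a diamond
# (each coupled to two neighbors) plus one pendant outer spin on each.
INNER_EDGES = [(0, 1), (1, 2), (2, 3), (3, 0)]   # the diamond
OUTER_EDGES = [(0, 4), (1, 5), (2, 6), (3, 7)]   # pendant magnets
J = 1.0                                          # ferromagnetic coupling (illustrative)

def energy(spins):
    """Ising energy: each aligned pair of coupled spins lowers the energy by J."""
    return -J * sum(spins[i] * spins[j] for i, j in INNER_EDGES + OUTER_EDGES)

# Enumerate all 2^8 configurations and keep the minima.
configs = list(itertools.product([-1, 1], repeat=8))
e_min = min(energy(s) for s in configs)
ground = [s for s in configs if energy(s) == e_min]

# This purely ferromagnetic toy has just two ground states (all up, all down),
# but the trapping mechanism is the same as described above: flipping any
# single spin out of a ground state costs energy, so single-flip (classical
# annealing) dynamics must climb a barrier to escape.
s = list(ground[0])
s[0] *= -1                     # flip one inner spin
barrier = energy(tuple(s)) - e_min
print(barrier)                 # 6.0: the flip breaks three bonds at a cost of 2J each
```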


In the quantum description of these events, this doesn't happen. After setting up the ground state, we start trying to move to the solution state (by varying the environment). As soon as we do that, the ground state splits up, and the isolated state where things get stuck raises up in energy, away from the ground state. Since everything is kept in the ground state, it is no surprise that we find that the probability of entering the isolated state reduces sharply.


But, notice that this is different from the classical case. In the classical case, there was no way to break up the ground state. In other words, the energetic descriptions of the classical and quantum ground states are not the same, and it is no surprise that they give two different results.


At heart, this difference was inevitable. When you get right down to it, we live in a quantum world, and if you are careful enough, that will shine through. In some ways, this shows how sloppy our thinking about the whole thing is. When we think of simulated annealing, or anything else like this, we imagine a purely classical or a purely quantum system. In reality, things are a lot more messy, with some aspects remaining classical and others showing their quantum nature.

Scooped by Dr. Stefan Gruenwald!

First time the quantum measurement barrier has been broken in a full scale gravitational wave detector

Researchers have moved one step closer towards detecting the existence of gravitational waves.


Nearly a century after the world's greatest physicist, Albert Einstein, first predicted the existence of gravitational waves, a global network of gravitational wave observatories has moved a step closer to detecting the faint radiation that could lead to important new discoveries in our universe.

David Blair is a Winthrop Professor of Physics at The University of Western Australia and Director of the Australian International Gravitational Research Centre at Gingin - 87km north of Perth.  He leads the WA component of a huge international team that has announced a demonstration of a new measurement technique called ‘quantum squeezing' that allows gravitational wave detectors to increase their sensitivity.


"This is the first time the quantum measurement barrier has been broken in a full scale gravitational wave detector," Professor Blair said.  "This is like breaking the sound barrier: some people said it would be impossible.  Breaking that barrier proved that supersonic flight was possible and today we know that it is not a barrier at all. "This demonstration opens up new possibilities for more and more sensitive gravitational wave detectors."


Gravitational waves are ripples in space-time generated by extreme cosmic events such as colliding stars, black holes, and supernova explosions, which carry vast amounts of energy at the speed of light. These events are thought to be happening about once a week within the range of the new detectors, which should achieve a first detection within a few years of beginning operation as their sensitivity is steadily improved.


With the addition of quantum squeezing, physicists will be able to see much more distant sources. However a southern hemisphere detector is needed to be able to pinpoint the location of signals and to reduce interference.


"Already gravitational wave detectors have been proved to be the most sensitive gravitational instruments ever created.  They measure motions measured in millionth of one millionth of one millionth of a metre.  The motions they detect are tiny, even compared to the size of a proton," Professor Blair said.


"The new results prove that the physicists are on track to take them to even higher levels of sensitivity.  This will open up the gravitational wave spectrum and allow humanity for the first time to hear the myriad of gravitational sounds that are thought to be constantly rippling through space at the speed of light."


In the research "Enhanced sensitivity of the LIGO gravitational wave detector by using squeezed states of light," published in the journal Nature Photonics, squeezed vacuum is injected into the dark port of the beam splitter to improve the performance of one of the detectors of the Laser Interferometer Gravitational-Wave Observatory (LIGO) beyond the quantum noise limit.

Scooped by Dr. Stefan Gruenwald!

What is behind Einstein's turbulences? Calculations give initial insight into relativistic properties of this process


The American Nobel Prize Laureate for Physics Richard Feynman once described turbulence as "the most important unsolved problem of classical physics", because a description of the phenomenon from first principles does not exist. This is still regarded as one of the six most important problems in mathematics today. David Radice and Luciano Rezzolla from the Max Planck Institute for Gravitational Physics (Albert Einstein Institute/AEI) in Potsdam have now taken a major step toward solving this problem: For the first time, a new computer code has provided relativistic calculations that give scientists a better understanding of turbulent processes in regimes that can be found in astrophysical phenomena.

Turbulent flows are very common and play a major role in the dynamics of physical processes. We all come across turbulence on a daily basis, for example every time we mix milk and coffee, or in gasoline-air mixture in combustion engines, or in the diluted hot plasma of the intergalactic medium.

As far back as the 15th century, turbulent vortices were studied by Leonardo da Vinci. In the 19th century, Claude Navier and George Stokes formulated equations that describe the motion of fluids and gases; the corresponding "Navier-Stokes equations" can also be used to describe turbulence. However, using simple geometrical and energetic arguments, the Russian mathematician Andrey Kolmogorov developed during the Second World War a phenomenological theory of turbulence that is still valid today.

Although Kolmogorov's predictions have been validated in a number of conditions, a fundamental mathematical theory of turbulence is still lacking. As a result, the "Analysis of the existence and regularity of solutions to the three-dimensional incompressible Navier-Stokes equations" is on the list of unsolved mathematical problems, for which the Clay Mathematics Institute in Cambridge, Massachusetts, offered prize money to the tune of one million US dollars in the year 2000 for its solution.

"Our studies showed that Kolmogorov's basic predictions for relativistic phenomena must be modified, because we are observing anomalies and new effects," says Rezzolla. "Interestingly, however, the most important prediction of Kolmogorov's theory appears to be still valid", notes Rezzolla when referring to the so-called -5/3 Kolmogorov law, which describes how the energy of a system is transferred from large to small vortices.
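The -5/3 law mentioned by Rezzolla states that in the inertial range the kinetic-energy spectrum scales as E(k) = C ε^(2/3) k^(-5/3). A minimal sketch of what that scaling implies (the Kolmogorov constant and dissipation rate below are illustrative placeholders):

```python
def kolmogorov_spectrum(k, epsilon=1.0, c_k=1.5):
    """Inertial-range spectrum E(k) = C * epsilon^(2/3) * k^(-5/3).

    epsilon (energy dissipation rate) and c_k (Kolmogorov constant, ~1.5
    empirically) are placeholders here; only the k-scaling matters below.
    """
    return c_k * epsilon ** (2.0 / 3.0) * k ** (-5.0 / 3.0)

# Doubling the wavenumber (looking at eddies half the size) reduces the
# spectral energy by 2^(5/3), independent of epsilon and the constant.
ratio = kolmogorov_spectrum(1.0) / kolmogorov_spectrum(2.0)
print(round(ratio, 2))  # 3.17
```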

Scooped by Dr. Stefan Gruenwald!

Team develops new water splitting technique that could produce hydrogen fuel cheaply using sunlight

A University of Colorado Boulder team has developed a radically new technique that uses the power of sunlight to efficiently split water into its components of hydrogen and oxygen, paving the way for the broad use of hydrogen as a clean, green fuel.


The CU-Boulder team has devised a solar-thermal system in which sunlight could be concentrated by a vast array of mirrors onto a single point atop a central tower up to several hundred feet tall. The tower would gather the heat generated by the mirror system, reaching roughly 2,500 degrees Fahrenheit (1,350 Celsius), then deliver it into a reactor containing chemical compounds known as metal oxides, said CU-Boulder Professor Alan Weimer, research group leader.


As a metal oxide compound heats up, it releases oxygen atoms, changing its material composition and causing the newly formed compound to seek out new oxygen atoms, said Weimer. The team showed that the addition of steam to the system—which could be produced by boiling water in the reactor with the concentrated sunlight beamed to the tower—would cause oxygen from the water molecules to adhere to the surface of the metal oxide, freeing up hydrogen molecules for collection as hydrogen gas.
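The two-step cycle described above can be written out for a generic metal oxide $\mathrm{MO}_x$ (the specific oxide compounds used in the Science paper are not named here, so this is the generic scheme rather than the team's exact chemistry). Solar heat drives the oxygen-releasing reduction; steam then re-oxidizes the material and liberates hydrogen:

```latex
\mathrm{MO}_x \;\xrightarrow{\;\text{solar heat}\;}\; \mathrm{MO}_{x-\delta} + \tfrac{\delta}{2}\,\mathrm{O}_2
\qquad\qquad
\mathrm{MO}_{x-\delta} + \delta\,\mathrm{H_2O} \;\longrightarrow\; \mathrm{MO}_x + \delta\,\mathrm{H_2}
```

Summing the two steps, the oxide cancels and the net reaction is simply $\mathrm{H_2O} \rightarrow \mathrm{H_2} + \tfrac{1}{2}\mathrm{O}_2$, with concentrated sunlight supplying the energy.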


"We have designed something here that is very different from other methods and frankly something that nobody thought was possible before," said Weimer of the chemical and biological engineering department. "Splitting water with sunlight is the Holy Grail of a sustainable hydrogen economy."


A paper on the subject was published in the Aug. 2 issue of Science. The team included co-lead authors Weimer and Associate Professor Charles Musgrave, first author and doctoral student Christopher Muhich, postdoctoral researcher Janna Martinek, undergraduate Kayla Weston, former CU graduate student Paul Lichty, former CU postdoctoral researcher Xinhua Liang and former CU researcher Brian Evanko.

No comment yet.
Rescooped by Dr. Stefan Gruenwald from green streets!

As self-driving cars move from fantasy to reality, what kind of effect will they have on cities?

As self-driving cars move from fantasy to reality, what kind of effect will they have on cities? | Amazing Science |

A fantastical research and urban prototyping project called Shuffle City investigates, and in the process, becomes a manifesto for a new kind of modern city--one that depends less on traditional public transportation like buses or light rail and more on creating a fleet of continuously moving automated vehicles to serve urban mobility needs.


Focusing on Houston--the country's car-oriented, fourth-largest city--the project "identifies opportunities outside of the ownership model to liberate an otherwise suppressed urban landscape, by programming a dynamic system of flow that is made more immediately possible through a public autonomous (driverless) vehicle fleet," according to its website. The project wonders: "Is there a new model for American cities, in which mobility can reverse the effect of city centers consumed by the private motor car and its needs?"


Shuffle City looks at the new possibilities that could arise from cities transitioning away from cars with drivers to cars without drivers. If cars were put into some constant flow as a public good, and if people didn’t all have their own vehicles, there would be no need for the concrete wastelands and lifeless towers that serve as parking infrastructure in the urban landscapes of car-centric cities like Phoenix and Los Angeles. Under the current ownership model, the average car spends 21 hours per day parked. The share of city space given over to parking lots would shrink, making way for more green space, environmental buffers, workspace, housing, retail, and denser planning for more walkable cities.
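The 21-hours-parked figure is the crux of the fleet argument, and the implied arithmetic is easy to sketch (the 50 percent fleet-utilization figure below is our illustrative assumption, not a number from the project):

```python
# Under the ownership model, the average car is parked 21 of 24 hours.
hours_parked = 21
private_utilization = (24 - hours_parked) / 24   # 0.125: in motion 12.5% of the day

# Assume (hypothetically) a shared autonomous fleet kept busy 50% of the day.
fleet_utilization = 0.50

# Cars needed to supply the same vehicle-hours, as a fraction of today's fleet:
fleet_fraction = private_utilization / fleet_utilization   # 0.25
```

Under these assumptions one shared vehicle replaces four privately owned ones, which is the intuition behind reclaiming parking land for other uses.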


Shuffle City includes maps of Houston that re-imagine the city with parking spaces cut out and filled in with new development, parks, and infrastructure. Calling itself "an alternative framework for future growing cities in America," the project is more of a visual exploration than a policy recommendation, and questions about radically altering the ownership model for automobiles in America are left unanswered. But the project’s bold take on unforeseen futures is thought-provoking all the same.

Via Lauren Moss
José Antônio Carlos - O Professor Pepe's curator insight, August 7, 2013 8:41 AM

A design for the city of our dreams. Cars without drivers, streets without parking spaces, and so on.

Kim Spence-Jones's curator insight, August 8, 2013 2:53 AM

Interface between cars and homes is an interesting area of R&D. Everything from entertainment synchronising to battery management.

miguel sa's curator insight, September 4, 2013 4:17 PM

Jacque Fresco has been talking about this sort of thing for a while now; looks like it's coming closer to reality.

Scooped by Dr. Stefan Gruenwald!

DNA Founder Closes in on Genetic Culprit for Undescribed Syndrome

DNA Founder Closes in on Genetic Culprit for Undescribed Syndrome | Amazing Science |

Hugh Rienhoff says that his nine-year-old daughter, Bea, is “a fire cracker”, “a tomboy” and “a very sassy, impudent girl”. But in a forthcoming research paper, he uses rather different terms, describing her hypertelorism (wide spacing between the eyes) and bifid uvula (a cleft in the tissue that hangs from the back of the palate). Both are probably features of a genetic syndrome that Rienhoff has obsessed over since soon after Bea’s birth in 2003. Unable to put on much muscle mass, Bea wears braces on her skinny legs to steady her on her curled feet. She is otherwise healthy, but Rienhoff has long worried that his daughter’s condition might come with serious heart problems.


Rienhoff, a biotech entrepreneur in San Carlos, California, who had trained as a clinical geneticist in the 1980s, went from doctor to doctor looking for a diagnosis. He bought lab equipment so that he could study his daughter’s DNA himself — and in the process, he became a symbol for the do-it-yourself biology movement, and a trailblazer in using DNA technologies to diagnose a rare disease (see Nature 449, 773–776; 2007).


“Talk about personal genomics,” says Gary Schroth, a research and development director at the genome-sequencing company Illumina in San Diego, California, who has helped Rienhoff in his search for clues. “It doesn’t get any more personal than trying to figure out what’s wrong with your own kid.”


Now nearly a decade into his quest, Rienhoff has arrived at an answer. Through the partial-genome sequencing of his entire family, he and a group of collaborators have found a mutation in the gene that encodes transforming growth factor-β3 (TGF-β3). Genes in the TGF-β pathway control embryogenesis, cell differentiation and cell death, and mutations in several related genes have been associated with Marfan syndrome and Loeys–Dietz syndrome, both of which have symptomatic overlap with Bea’s condition. The mutation, which has not been connected to any disease before, seems to be responsible for Bea’s clinical features, according to a paper to be published in the American Journal of Medical Genetics.


Hal Dietz, a clinician at Johns Hopkins University School of Medicine in Baltimore, Maryland, where Rienhoff trained as a geneticist, isn’t surprised that the genetic culprit is in this pathway. “The overwhelming early hypothesis was that this was related,” says Dietz, who co-discovered Loeys–Dietz syndrome in 2005.


Rienhoff had long been tapping experts such as Dietz for assistance. In 2005, an examination at Johns Hopkins revealed Bea’s bifid uvula. This feature, combined with others, suggested Loeys–Dietz syndrome, which is caused by mutations in TGF-β receptors. But physicians found none of the known mutations after sequencing these genes individually. This was a relief: Loeys–Dietz is associated with devastating cardiovascular complications and an average life span of 26 years.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Using Infrared Data, Astronomers Image Lowest-mass Exoplanet Around a Sun-like Star

Using Infrared Data, Astronomers Image Lowest-mass Exoplanet Around a Sun-like Star | Amazing Science |

Using infrared data from the Subaru Telescope in Hawaii, an international team of astronomers has imaged a giant planet around the bright star GJ 504. Glowing a dark magenta, the newly discovered exoplanet GJ 504b weighs in with about four times Jupiter's mass, making it the lowest-mass planet ever directly imaged around a star like the sun.


"If we could travel to this giant planet, we would see a world still glowing from the heat of its formation with a color reminiscent of a dark cherry blossom, a dull magenta," said Michael McElwain, a member of the discovery team at NASA's Goddard Space Flight Center in Greenbelt, Md. "Our near-infrared camera reveals that its color is much more blue than other imaged planets, which may indicate that its atmosphere has fewer clouds."


According to the most widely accepted picture, called the core-accretion model, Jupiter-like planets get their start in the gas-rich debris disk that surrounds a young star. A core produced by collisions among asteroids and comets provides a seed, and when this core reaches sufficient mass, its gravitational pull rapidly attracts gas from the disk to form the planet.


While this model works fine for planets out to where Neptune orbits, about 30 times Earth's average distance from the sun (30 astronomical units, or AU), it's more problematic for worlds located farther from their stars. GJ 504b lies at a projected distance of 43.5 AU from its star; the actual distance depends on how the system tips to our line of sight, which is not precisely known.


"This is among the hardest planets to explain in a traditional planet-formation framework," explained team member Markus Janson, a Hubble postdoctoral fellow at Princeton University in New Jersey. "Its discovery implies that we need to seriously consider alternative formation theories, or perhaps to reassess some of the basic assumptions in the core-accretion theory."


The research is part of the Strategic Explorations of Exoplanets and Disks with Subaru (SEEDS), a project to directly image extrasolar planets and protoplanetary disks around several hundred nearby stars using the Subaru Telescope on Mauna Kea, Hawaii. The five-year project began in 2009 and is led by Motohide Tamura at the National Astronomical Observatory of Japan (NAOJ).

No comment yet.
Scooped by Dr. Stefan Gruenwald!

New stamp-sized microfluidic chip sorts cells through a technique known as cell rolling

New stamp-sized microfluidic chip sorts cells through a technique known as cell rolling | Amazing Science |

Early in 2012, MIT scientists reported on the development of a postage stamp-sized microchip capable of sorting cells through a technique, known as cell rolling, that mimics a natural mechanism in the body. The device successfully separated leukemia cells from cell cultures — but could not extract cells directly from blood. 

Now the group has developed a new microchip that can quickly separate white blood cells from samples of whole blood, eliminating any preliminary processing steps — which can be difficult to integrate into point-of-care medical devices. The hope, the researchers say, is to integrate the microchip into a portable diagnostic device that may be used to directly analyze patient blood samples for signs of inflammatory disease such as sepsis — particularly in regions of developing countries where diagnostic lab equipment is not readily available.


In their experiments, the scientists pumped tiny volumes of blood through the microchip and recovered a highly pure stream of white blood cells, virtually devoid of other blood components such as platelets and red blood cells. What’s more, the team found that the sorted cells were undamaged and functional, potentially enabling clinicians not only to obtain a white blood cell count, but also to use the cells to perform further genetic or clinical tests. 

Rohit Karnik, an associate professor of mechanical engineering at MIT, says the key to recovering such pure, functional cells lies in the microchip’s adaptation of the body’s natural process of cell rolling. 

“We believe that because we’re using a very biomimetic process, the cells are happier,” Karnik says. “It’s a more gentle process, and the cells are functionally viable.”

H. Fai Poon's curator insight, October 17, 2013 12:56 AM

Now someone make it into a cell sorter please.

Scooped by Dr. Stefan Gruenwald!

Making a smartphone even smarter: Turning it into a biosensor for toxins and bacteria

Afraid there may be peanuts or other allergens hiding in that cookie? Thanks to a cradle and app that turn your smartphone into a handheld biosensor, you may soon be able to run on-the-spot tests for food safety, environmental toxins, medical diagnostics and more.

The handheld biosensor was developed by researchers at the University of Illinois, Urbana-Champaign. A series of lenses and filters in the cradle mirror those found in larger, more expensive laboratory devices. Together, the cradle and app transform a smartphone into a tool that can detect toxins and bacteria, spot water contamination and identify allergens in food.


Kenny Long, a graduate researcher at the university, says the team was able to make the smartphone even smarter with modifications to the cellphone camera.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Spray-Assisted Layer-By-Layer Functionalization of PRINT Built-To-Order Nanomedicine

Spray-Assisted Layer-By-Layer Functionalization of PRINT Built-To-Order Nanomedicine | Amazing Science |

A new coating technology developed at MIT, combined with a novel nanoparticle-manufacturing technology developed at the University of North Carolina at Chapel Hill, may offer scientists a way to quickly mass-produce tailored nanoparticles that are specially coated for specific applications, including medicines and electronics. 

Using this new combination of the two existing technologies, scientists can produce very small, uniform particles with customized layers of material that can carry drugs or other molecules to interact with their environment, or even target specific types of cells. 

Creating highly reproducible batches of precisely engineered, coated nanoparticles is important for the safe manufacture of drugs and obtaining regulatory approval, says Paula Hammond, the David H. Koch Professor in Chemical Engineering at MIT and a member of MIT’s Koch Institute for Integrative Cancer Research.

“Everyone’s excited about nanomedicine’s potential, and there are some systems that are making it out to market, but people are also concerned about how reproducible each batch is. That’s especially critical for applications such as cancer therapies,” Hammond says. “Fortunately, we have combined two technologies that are at the forefront of addressing these issues and that show great promise for the future of nanomanufacturing.”


Hammond’s lab previously developed a layer-by-layer deposition technique for coating nanoparticle surfaces with alternating layers of drugs, RNA, proteins or other molecules of interest. Those coatings can also be designed to protect nanoparticles from being destroyed by the body’s immune system before reaching their intended targets. 

“It’s a very versatile platform for incorporating therapeutics,” Hammond says. However, the layer-by-layer application processes commonly used today to coat nanoparticles take too long to be useful for rapid, large-scale manufacture: For each layer, the particles must be soaked in a solution of the coating material, then spun in a centrifuge to remove excess coating. Applying each layer takes about an hour.

In the new study, the MIT researchers used a spray-based technique, which allows them to apply each layer in just a few seconds. This technology was previously developed in the Hammond lab and is now being commercialized by Svaya Nanotechnologies. 

Hammond combined this approach with a nanoparticle-manufacturing technology known as the PRINT (Particle Replication In Non-wetting Templates) platform, which was developed in the DeSimone lab at UNC and is now being commercialized by Liquidia Technologies. Liquidia focuses on using the PRINT platform to create novel nanotechnology-based health-care products, vaccines and therapeutics.  

No comment yet.
Scooped by Dr. Stefan Gruenwald!

New technology offers 3D images inside colon, pointing toward better colonoscopy

New technology offers 3D images inside colon, pointing toward better colonoscopy | Amazing Science |

MIT researchers have developed a new endoscopy technology that could make it easier for doctors to detect precancerous lesions in the colon. Early detection of such lesions has been shown to reduce death rates from colorectal cancer, which kills about 50,000 people per year in the United States.

The new technique, known as photometric stereo endoscopy, can capture topographical images of the colon surface along with traditional two-dimensional images. Such images make it easier to see precancerous growths, including flatter lesions that traditional endoscopy usually misses, says Nicholas Durr, a research fellow in the Madrid-MIT M+Vision Consortium, a recently formed community of medical researchers in Boston and Madrid.


“In conventional colonoscopy screening, you look for these characteristic large polyps that grow into the lumen of the colon, which are relatively easy to see,” Durr says. “However, a lot of studies in the last few years have shown that more subtle, nonpolypoid lesions can also cause cancer.”


In the United States, colonoscopies are recommended beginning at age 50, and are credited with reducing the risk of death from colorectal cancer by about half. Traditional colonoscopy uses endoscopes with fiber-optic cameras to capture images.

Durr and his colleagues, seeking medical problems that could be solved with new optical technology, realized that there was a need to detect lesions that colonoscopy can miss. A technique called chromoendoscopy, in which a dye is sprayed in the colon to highlight topographical changes, offers better sensitivity but is not routinely used because it takes too long.


“What is attractive about this technique for colonoscopy is that it provides an added dimension of diagnostic information, particularly about three-dimensional morphology on the surface of the colon,” says Nimmi Ramanujam, a professor of biological engineering at Duke University who was not part of the research team.

The researchers built two prototypes — one 35 millimeters in diameter, which would be too large to use for colonoscopy, and one 14 millimeters in diameter, the size of a typical colonoscope. In tests with an artificial silicone colon, the researchers found that both prototypes could create 3-D representations of polyps and flatter lesions. 

The new technology should be easily incorporated into newer endoscopes, Durr says. “A lot of existing colonoscopes already have multiple light sources,” he says. “From a hardware perspective all they need to do is alternate the lights and then update their software to process this photometric data.” 
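Durr's remark about alternating light sources hints at how the processing works: photometric stereo recovers surface orientation from images of the same scene taken under different known light directions. A minimal Lambertian sketch (illustrative only; the group's actual algorithms are not described here) solves for a per-pixel normal by least squares:

```python
import numpy as np

def photometric_stereo(images, light_dirs):
    """Estimate per-pixel surface normals and albedo from >= 3 images
    taken under known, distant light directions (Lambertian model).

    images:     array (m, h, w), one grayscale image per light source
    light_dirs: array (m, 3), unit light-direction vectors
    """
    m, h, w = images.shape
    intensities = images.reshape(m, -1)               # (m, h*w)
    # Lambertian model: I = L @ g, where g = albedo * normal
    g, *_ = np.linalg.lstsq(light_dirs, intensities, rcond=None)
    albedo = np.linalg.norm(g, axis=0)
    normals = g / np.clip(albedo, 1e-12, None)        # unit normals
    return normals.reshape(3, h, w), albedo.reshape(h, w)

# Synthetic check: a flat surface facing the camera (normal = +z, albedo = 1).
lights = np.array([[0.0, 0.0, 1.0],
                   [0.6, 0.0, 0.8],
                   [0.0, 0.6, 0.8]])
true_normal = np.array([0.0, 0.0, 1.0])
images = (lights @ true_normal).reshape(3, 1, 1) * np.ones((3, 4, 4))
n, rho = photometric_stereo(images, lights)
```

On the synthetic input the recovered normal field is uniformly +z, as expected; real endoscope images would of course add calibration, shadows, and specular highlights on top of this skeleton.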

The researchers plan to test the technology in human patients in clinical trials at Massachusetts General Hospital (MGH) and the Hospital Clinico San Carlos in Madrid. They are also working on additional computer algorithms that could help to automate the process of identifying polyps and lesions from the topographical information generated by the new system. 

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Russian meteor may have gangmates: 20 asteroids on similar paths to rock that exploded near Chelyabinsk

Russian meteor may have gangmates: 20 asteroids on similar paths to rock that exploded near Chelyabinsk | Amazing Science |

The house-sized rock that exploded spectacularly in the skies near Chelyabinsk, Russia, in February may have been a member of a gang of asteroids that still poses a threat to Earth, a new study says. The evidence is circumstantial, but future observations could help to settle the question.

On 15 February, an 11,000-tonne space rock slammed into the atmosphere above Russia, producing the most powerful impact since the Tunguska explosion in 1908 — which may also have been caused by an asteroid — and generating a shock wave that damaged buildings and injured more than 1,000 people. The 18-metre-wide object could not be seen as it approached the planet because it was obscured by the Sun's glare, but observations made while it was in the atmosphere have enabled several groups of researchers to estimate its orbit.


However, the estimates varied so much that there was no clear orbit that researchers could use to hunt for sibling asteroids on a similar path, say Carlos and Raúl de la Fuente Marcos, orbital dynamicist brothers at the Complutense University of Madrid. They decided to tackle the problem with brute computational force, running simulations of billions of possible orbits to find the ones most likely to have led to a collision. They then used the average of the ten best orbits to search a NASA asteroid catalogue for known objects on similar paths. They found about 20, ranging in size from 5 to 200 metres across, they report in an article to be published in Monthly Notices of the Royal Astronomical Society: Letters.
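The sibling search boils down to comparing orbital elements. The paper's actual pipeline is far more involved, but a toy version of the idea (score catalogue objects by the distance of their semi-major axis, eccentricity, and inclination from a reference orbit, a crude stand-in for the D-criteria used in such studies) might look like this; all element values below are illustrative placeholders, not the published data:

```python
import math

def element_distance(orbit, ref):
    """Crude similarity score between two orbits given as
    (a [AU], e, i [deg]); smaller means more alike. Illustrative only."""
    da = (orbit[0] - ref[0]) / ref[0]                        # relative semi-major axis
    de = orbit[1] - ref[1]                                   # eccentricity difference
    di = 2 * math.sin(math.radians(orbit[2] - ref[2]) / 2)   # inclination term
    return math.sqrt(da**2 + de**2 + di**2)

# Hypothetical reference orbit for the Chelyabinsk impactor and a
# made-up mini-catalogue (placeholder numbers).
ref = (1.72, 0.57, 4.3)
catalogue = {"candidate sibling": (1.70, 0.54, 3.4),
             "unrelated object":  (2.90, 0.12, 21.0)}
scores = {name: element_distance(orb, ref) for name, orb in catalogue.items()}
best = min(scores, key=scores.get)
```

Ranking a real catalogue this way would surface the handful of objects whose paths hug the reference orbit, which the authors then scrutinized individually.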

The researchers propose that these rocks are pieces of a rubbly asteroid that came apart some time in the past 40,000 years. The break-up may have been triggered by stresses from temperature swings as the parent asteroid looped out past Mars and then back towards Venus on its travels around the Sun, says Carlos de la Fuente Marcos.


The rocky fragments could one day follow their planet-pummelling sibling to Earth, he says. "More objects with the same orbital signature may encounter our planet in the future."


But don't panic just yet. The researchers acknowledge that the gravitational pull of the planets might have affected the path of each asteroid in a slightly different way, and that those differences would tend to grow over time. So even if the orbits of these objects were very similar at first, they could change completely.


No comment yet.
Scooped by Dr. Stefan Gruenwald!

Crab-Like Robot Walks Along the Ocean Floor to Investigate Shipwrecks

Crab-Like Robot Walks Along the Ocean Floor to Investigate Shipwrecks | Amazing Science |

As six-legged robots go, other than its nifty red and yellow paint job, the Crabster robot has a pretty standard look. It isn’t the biggest hexapod, like the impressive two-ton Mantis, or a tiny hexapod with a weird gait, like Boston Dynamics’ RHex. What makes Crabster special isn’t so much what it is but where it will walk—the robot was designed to navigate the seafloor.


Ocean researchers already use both autonomous and remote-control undersea vehicles, but propulsion systems tend to kick up sediment, adversely affecting visibility, and lack the power to deal with strong currents.

Crabster’s creators designed the robot to solve these problems. Developed by the Korea Institute of Ocean Science and Technology (KIOST), the robot can withstand heavy currents by changing its posture (roll, pitch, and yaw), and the robot’s measured gait won’t significantly disturb sediment.


Crabster is lowered to the seafloor by crane and remains attached to an umbilical for power, limiting where it can go but allowing for continuous operation. Four operators remotely drive the robot from the surface—directing and monitoring its movement, manipulators, cameras, lights, and sonar.

On the seafloor, the half-ton robot illuminates murky water with a spotlight, records what it sees with ten onboard cameras, and uses its two front legs to pick up and manipulate objects. Researchers hope to send Crabster to explore shipwrecks where they can return small treasures in the robot’s retractable tray. They’ll haul larger objects by attaching a tow cable connected to the vessel above.


Crabster recently took its first dip in the ocean and will soon head out to sea to begin work 200 meters below the surface. Eventually Crabster’s engineers hope to give it an onboard power source, and we imagine future iterations might combine the best of both worlds—a Crabster that folds its legs to go swimming and, when a stroll better suits its purposes, deploys its legs for a landing on the seafloor.

Ron Peters's curator insight, October 17, 2013 1:08 PM

Interesting ROV/AUV twist...

Scooped by Dr. Stefan Gruenwald!

Researchers Unveil a Novel Double-Stranded Method for Highly Accurate SNP Genotyping

Researchers Unveil a Novel Double-Stranded Method for Highly Accurate SNP Genotyping | Amazing Science |

Scientists at Rice University and the University of Washington (UW) this week unveiled a groundbreaking new method for detecting minute changes known as single nucleotide polymorphisms (SNPs) in the human genome. The human genome has more than 6 billion base pairs, and one of the revelations of modern genomics is that even the slightest change in the sequence — a single-nucleotide difference — can have profound effects.


The new SNP genotyping technique, dubbed “double-stranded toehold exchange,” is described in a new paper in Nature Chemistry. The patent-pending method is markedly different — in both form and performance — from any of the dozen-plus methods already used to detect SNPs.


“There are two axes of performance in SNP detection — read length and specificity,” said study co-author David Zhang, who joined Rice’s Department of Bioengineering this month. “We’re at least an order of magnitude better on each axis. In fact, in terms of specificity, our theoretical work suggests that we can do quadratically better, meaning that whatever the best level of specificity is with a single-stranded method, our best will be that number squared.”


Scientists have sequenced the genomes of dozens of species, but those species-level genomes only tell part of the genetic story for a given individual. In people, for example, slight differences in just a few nucleotides can mean the difference between having green or brown eyes. This type of genetic variation within a species is called polymorphism, and SNPs are the smallest unit of polymorphic variation.


SNPs are the most frequently occurring genetic variation in the human genome; more than 30 million have been confirmed. But they also occur in other species, even in single-celled organisms. In the bacteria that cause tuberculosis (TB), for example, an SNP in the right location can allow the disease to fight off antibiotics like rifampicin, one of the most commonly prescribed anti-TB drugs. Though small on a molecular scale, that single-nucleotide difference has serious implications for TB patients. Rather than going through a six-month course of antibiotics costing about $20, patients with drug-resistant TB often face more than two years of treatment with drugs that sometimes have permanent side effects and can cost more than $2,500.


Selectivity is also an issue in SNP detection. The more selective a test is, the more likely it is to detect a rarely occurring SNP. For example, a TB patient might be infected with both drug-resistant and non-drug-resistant strains of TB at the same time.


“Maybe only 1 percent of the TB in the patient is resistant to rifampicin,” said Zhang, Rice’s Ted Law Jr. Assistant Professor of Bioengineering. “If you treat that person with rifampicin, the result is that you will kill the 99 percent and give the drug-resistant variety a chance to become well-established.”


An SNP test would need a selectivity of 100-to-1 to accurately diagnose the patient in the above example. Zhang said some current methods can approach that level of selectivity, but only if samples are prepared with painstaking attention to temperature, pH and other conditions.

“Our selectivity was about 12,000-to-1 in this study, and we don’t require any special conditions,” he said.
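The selectivity figures map directly onto the rare-variant problem described above. As rough illustrative arithmetic (ours, not the paper's analysis): if a probe reacts with the target SNP S times more readily than with the wild-type sequence, a variant at fraction f yields a true signal of roughly f against a wild-type cross-reaction of roughly (1 - f)/S, so S must comfortably exceed the background-to-variant ratio:

```python
def signal_to_background(variant_fraction, selectivity):
    """Ratio of true signal (rare variant) to false signal (wild-type
    cross-reaction), under a simple proportional-response model.
    Illustrative arithmetic only."""
    true_signal = variant_fraction
    false_signal = (1.0 - variant_fraction) / selectivity
    return true_signal / false_signal

# The rifampicin example: 1% of the TB population is drug-resistant.
marginal = signal_to_background(0.01, 100)     # ~100:1 probe: signal barely beats background
robust = signal_to_background(0.01, 12000)     # ~12,000:1, as reported in the study
```

With 100-to-1 selectivity the true signal only just matches the background (consistent with the "would need a selectivity of 100-to-1" remark above), whereas 12,000-to-1 leaves the variant signal two orders of magnitude clear of it.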


Zhang and co-author Georg Seelig of UW said the selectivity of the technique derives from the use of double-stranded probes. Crafting these wasn’t easy, they said. The team had to add single-stranded “toeholds” to each end of the double-stranded probe to improve reaction kinetics and to speed the binding process enough to make the test practical. “We are moving forward with the goal of applying this technology to infectious disease diagnostics,” Seelig said.

No comment yet.
Scooped by Dr. Stefan Gruenwald!

Current climate change is occurring 10 times faster than at any time in past 65 million years

Current climate change is occurring 10 times faster than at any time in past 65 million years | Amazing Science |

Our planet is undergoing one of the largest changes in climate since the dinosaurs went extinct. But what might be even more troubling for humans, plants and animals is the speed of the change.


If this trend continues at its current rapid pace, it will place significant stress on terrestrial ecosystems around the world, and many species will need to make behavioral, evolutionary or geographic adaptations to survive. Although some of the changes the planet will experience in the next few decades are already "baked into the system," how different the climate looks at the end of the 21st century will depend largely on how humans respond.


The findings come from a review of climate research by Noah Diffenbaugh, an associate professor of environmental Earth system science, and Chris Field, a professor of biology and of environmental Earth system science and the director of the Department of Global Ecology at the Carnegie Institution. The work is part of a special report on climate change in the current issue of Science.


Diffenbaugh and Field, both senior fellows at the Stanford Woods Institute for the Environment, conducted the targeted but broad review of scientific literature on aspects of climate change that can affect ecosystems, and investigated how recent observations and projections for the next century compare to past events in Earth's history.


For instance, the planet experienced a 5 degree Celsius hike in temperature 20,000 years ago, as Earth emerged from the last ice age. This is a change comparable to the high end of the projections for warming over the 20th and 21st centuries.


The geologic record shows that, 20,000 years ago, as the ice sheet that covered much of North America receded northward, plants and animals recolonized areas that had been under ice. As the climate continued to warm, those plants and animals moved northward, to cooler climes.
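The pace argument is, at bottom, simple rate arithmetic. A back-of-envelope comparison (our own, not a calculation from the Science paper) of the deglacial warming described above with a high-end 21st-century projection:

```python
# Deglacial warming: roughly 5 degC spread over ~20,000 years.
past_rate = 5.0 / 20000            # ~0.00025 degC per year

# High-end projection: roughly 5 degC over the ~150 years to 2100 (assumed span).
projected_rate = 5.0 / 150         # ~0.033 degC per year

speedup = projected_rate / past_rate   # roughly 130-fold
```

The deglaciation was not the fastest episode in the geologic record, so the headline figure of a tenfold speed-up relative to the fastest past changes is more conservative than this crude ratio, but the ratio shows why the order of magnitude is plausible.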

Jalpa Vyas's curator insight, August 4, 2013 12:20 PM

Nobody likes change, but this one is inevitable. It is already quite evident that changes to our climate are occurring, and this article suggests they are happening at a faster rate than has yet been recorded. We may need to be prepared to adapt sooner than we anticipated.

Molly Langstraat's curator insight, September 20, 2013 2:33 PM

I think the change is inevitable. Humans and animals are going to have to learn to adjust as the climate continues to change. If we cut down on our pollution, the rate of climate change will slow. Humans need to learn how to help our Earth, not hurt it.