Google launches an effort to build its own quantum computer that has the potential to change computing forever. Google is about to begin designing and building hardware for a quantum computer, a type of machine that can exploit quantum physics to solve problems that would take a conventional computer millions of years. Since 2009, Google has been working with controversial startup D-Wave Systems, which claims to make “the first commercial quantum computer.” Last year, Google purchased one of D-Wave’s machines in order to test it thoroughly. But independent tests published earlier this year found no evidence that D-Wave’s computer uses quantum physics at all to solve problems more efficiently than a conventional machine.
Now, John Martinis, a professor at the University of California, Santa Barbara, has joined Google to establish a new quantum hardware lab near the university. He will try to make his own versions of the kind of chip inside a D-Wave machine. Martinis has spent more than a decade working on a more proven approach to quantum computing, and has built some of the largest, most error-free systems of qubits, the basic building blocks that encode information in a quantum computer.
“We would like to rethink the design and make the qubits in a different way,” says Martinis of his effort to improve on D-Wave’s hardware. “We think there’s an opportunity in the way we build our qubits to improve the machine.” Martinis has taken a joint position with Google and UCSB that will allow him to continue his own research at the university.
Quantum computers could be immensely faster than any existing computer at certain problems. That’s because qubits working together can use the quirks of quantum mechanics to quickly discard incorrect paths to a solution and home in on the correct one. However, qubits are tricky to operate because quantum states are so delicate.
Chris Monroe, a professor who leads a quantum computing lab at the University of Maryland, welcomed the news that one of the leading lights in the field was going to work on the question of whether designs like D-Wave’s can be useful. “I think this is a great development to have legitimate researchers give it a try,” he says.
Since showing off its first machine in 2007, D-Wave has irritated academic researchers by making claims for its computers without providing the evidence its critics say is needed to back them up. However, the company has attracted over $140 million in funding and sold several of its machines (see “The CIA and Jeff Bezos Bet on Quantum Computing”).
There is no question that D-Wave’s machine can perform certain calculations. And research published in 2011 showed that the machine’s chip harbors the right kind of quantum physics needed for quantum computing. But evidence is lacking that it uses that physics in the way needed to unlock the huge speedups promised by a quantum computer. It could be solving problems using only ordinary physics.
Martinis’s previous work has been focused on the conventional approach to quantum computing. He set a new milestone in the field this April, when his lab announced that it could operate five qubits together with relatively low error rates. Larger systems of such qubits could be configured to run just about any kind of algorithm depending on the problem at hand, much like a conventional computer. To be useful, a quantum computer would probably need to be built with tens of thousands of qubits or more.
Martinis was a coauthor on a paper published in Science earlier this year that took the most rigorous independent look at a D-Wave machine yet. It concluded that in the tests run on the computer, there was “no evidence of quantum speedup.” Without that, critics say, D-Wave is nothing more than an overhyped, and rather weird, conventional computer. The company counters that the tests of its machine involved the wrong kind of problems to demonstrate its benefits.
Martinis’s work on D-Wave’s machine led him into talks with Google, and to his new position. Theory and simulation suggest that it might be possible for quantum annealers like D-Wave’s to deliver quantum speedups, and he considers it an open question. “There’s some really interesting science that people are trying to figure out,” he says.
Safe and effective vaccines and drugs are needed for the prevention and treatment of Ebola virus disease, including following a potentially high-risk exposure such as a needlestick. This case report assesses the response to postexposure vaccination in a health care worker who was exposed to the Ebola virus.
Case report of a physician who experienced a needlestick while working in an Ebola treatment unit in Sierra Leone on September 26, 2014. Medical evacuation to the United States was rapidly initiated. Given the concern about potentially lethal Ebola virus disease, the patient was offered, and provided his consent for, postexposure vaccination with an experimental vaccine available through an emergency Investigational New Drug application. He was vaccinated on September 28, 2014. The vaccine used was VSVΔG-ZEBOV, a replicating, attenuated, recombinant vesicular stomatitis virus (serotype Indiana) whose surface glycoprotein gene was replaced by the Zaire Ebola virus glycoprotein gene. This vaccine has entered a clinical trial for the prevention of Ebola in West Africa.
The vaccine was administered 43 hours after the needlestick occurred. Fever and moderate to severe symptoms developed 12 hours after vaccination and diminished over 3 to 4 days. The real-time reverse transcription polymerase chain reaction results were transiently positive for vesicular stomatitis virus nucleoprotein gene and Ebola virus glycoprotein gene (both included in the vaccine) but consistently negative for Ebola virus nucleoprotein gene (not in the vaccine). Early postvaccination cytokine secretion and T lymphocyte and plasmablast activation were detected. Subsequently, Ebola virus glycoprotein-specific antibodies and T cells became detectable, but antibodies against Ebola viral matrix protein 40 (not in the vaccine) were not detected.
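The gene-level pattern here is what let clinicians distinguish the circulating vaccine virus from a true Ebola infection. Below is a minimal sketch in Python of that interpretive rule, using the three gene targets named above; the function and its decision logic are illustrative, not the actual clinical assay.

```python
# Minimal sketch of the RT-PCR interpretation described above.
# The vaccine carries the VSV nucleoprotein and Ebola glycoprotein genes;
# the Ebola nucleoprotein gene is NOT in the vaccine, so a positive there
# would point to actual Ebola virus infection. Illustrative logic only.
def interpret_rt_pcr(vsv_np_positive: bool, ebov_gp_positive: bool,
                     ebov_np_positive: bool) -> str:
    if ebov_np_positive:
        return "consistent with Ebola virus infection"
    if vsv_np_positive and ebov_gp_positive:
        return "consistent with circulating vaccine virus only"
    return "no evidence of either virus"

# The patient's transient post-vaccination result pattern:
print(interpret_rt_pcr(True, True, False))
# -> consistent with circulating vaccine virus only
```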
It is currently unknown if VSVΔG-ZEBOV is safe or effective for post-exposure vaccination in humans who have experienced a high-risk occupational exposure to the Ebola virus, such as a needlestick. In this patient, postexposure vaccination with VSVΔG-ZEBOV induced a self-limited febrile syndrome that was associated with transient detection of the recombinant vesicular stomatitis vaccine virus in blood. Strong innate and Ebola-specific adaptive immune responses were detected after vaccination. The clinical syndrome and laboratory evidence were consistent with vaccination response, and no evidence of Ebola virus infection was detected.
Leading commercial electronic cigarettes were tested to determine bulk composition. The e-cigarettes and conventional cigarettes were evaluated using machine-puffing to compare nicotine delivery and relative yields of chemical constituents. The e-liquids tested were found to contain humectants (glycerin and/or propylene glycol, ⩾75% content); water (<20%); nicotine (approximately 2%); and flavors (<10%). The aerosol collected mass (ACM) of the e-cigarette samples was similar in composition to the e-liquids. Aerosol nicotine for the e-cigarette samples was 85% lower than nicotine yield for the conventional cigarettes.
Analysis of the smoke from conventional cigarettes showed that the mainstream cigarette smoke delivered approximately 1,500 times more harmful and potentially harmful constituents (HPHCs) tested when compared to e-cigarette aerosol or to puffing room air. The deliveries of HPHCs tested for these e-cigarette products were similar to the study air blanks rather than to deliveries from conventional cigarettes; no significant contribution of cigarette smoke HPHCs from any of the compound classes tested was found for the e-cigarettes. Thus, the results of this study support previous researchers’ discussion of e-cigarette products’ potential for reduced exposure compared to cigarette smoke.
Over at the Defense Advanced Research Projects Agency, also known as DARPA, there are some pretty amazing (and often top-secret) things going on. But one notable component of a DARPA project was revealed by a Defense Department official at a recent forum, and it is the stuff of science fiction movies. According to DARPA Director Arati Prabhakar, a paralyzed woman was successfully able to use her thoughts to control an F-35 and a single-engine Cessna in a flight simulator.
It's just the latest advance for one woman, 55-year-old Jan Scheuermann, who has been the subject of two years of groundbreaking neurosignaling research. First, Scheuermann began by controlling a robotic arm and accomplishing tasks such as feeding herself a bar of chocolate and giving high fives and thumbs-ups. Then, researchers learned that -- surprisingly -- Scheuermann was able to control both right-hand and left-hand prosthetic arms with just the left motor cortex, which is typically responsible for controlling the right side of the body. After that, Scheuermann decided she was up for a new challenge, according to Prabhakar.
"Jan decided that she wanted to try flying a Joint Strike Fighter simulator," Prabhakar said, prompting laughter from the crowd at the New America Foundation's Future of War forum. "So Jan got to fly in the simulator."
Unlike pilots who use the simulator technology for training, Scheuermann wasn't thinking about controlling the plane with a joystick. She thought about flying the plane itself -- and it worked. "In fact," Prabhakar noted, "for someone who's never flown -- she's not a pilot in real life -- she's in there flying a simulator directly from neurosignaling."
Scheuermann has been paralyzed since 2003 because of a neurodegenerative condition. In 2012, she agreed to be fitted with two probes on the surface of her brain in the motor cortex area responsible for right hand and arm movements. In the last two years, she has tolerated those probes better than expected; as a result, she's been the subject of increasingly sophisticated experiments in conjunction with the University of Pittsburgh Medical Center and DARPA's Revolutionizing Prosthetics program, to determine just how much she can do simply by thinking about it.
It’s almost April 15, and you may be worrying about how much this year’s taxes will hurt. But a new study published today suggests there’s a whole world of economic losses in the air around us that few of us know anything about.
The study, published in the journal Climatic Change, is the first to pull together a proper accounting of the hidden costs of greenhouse gas emissions. It shows the true (and much higher) cost that we pay in dollars at the pump and light switch—or in human lives at the emergency room.
Drew Shindell, a professor at Duke University, has attempted to play CPA to our industrialized, emitting world. He has tabulated what he calls “climate damages” for a whole range of emissions: greenhouse gases like CO2 and methane, short-lived pollutants like aerosols, and more persistent gases like nitrous oxides.
Shindell also estimated the yearly damages from power plants in the U.S. Using coal costs us the most, with climate damages adding almost 30 cents per kilowatt hour to the 10 cents we currently pay. The price of gas-fired power rises to 17 cents from 7 cents per kilowatt hour.
For the average homeowner who uses natural gas, the real bill after climate damages is double. And those of us who get our electricity from coal-fired power plants have energy bills that are really four times what we see in our monthly statements.
Shindell calculates the total yearly emissions price tag—between transportation, electricity, and industrial combustion—at between $330 billion and $970 billion. That wide spread depends on the choice of a discount rate, which reflects the relative value of money over the years and decades of climate change to come.
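As a rough check on these figures, here is a minimal sketch in Python that folds the quoted climate damages into today's electricity prices. The per-kilowatt-hour numbers come from the article; the multiplier arithmetic is ours.

```python
# Sticker prices and estimated climate damages, in $/kWh, as quoted above.
retail = {"coal": 0.10, "gas-fired": 0.07}
damages = {"coal": 0.30, "gas-fired": 0.10}  # gas: 17 cents minus 7 cents

for fuel in retail:
    true_cost = retail[fuel] + damages[fuel]
    multiple = true_cost / retail[fuel]
    print(f"{fuel}: {retail[fuel]:.2f} -> {true_cost:.2f} $/kWh "
          f"({multiple:.1f}x the sticker price)")
# coal: 0.10 -> 0.40 $/kWh (4.0x), matching the "four times" figure above
# gas-fired: 0.07 -> 0.17 $/kWh (2.4x)
```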
A 2.8 million-year-old jawbone may be the oldest human fossil in existence, according to two papers published simultaneously in Science. Researchers now suspect that Homo (the genus that includes modern humans) arose at least 400,000 years earlier than previously thought.
For decades scientists have been scouring Africa for ancient human remains. Archaeologists think Homo habilis, the first truly “human-like” primate, lived about 2.5 million years ago, and Lucy, the human-ape mashup who is perhaps our most famous ancestor, lived about 3.2 million years ago.
But this particular fossil, found in the Afar region of Ethiopia and temporarily named LD 350-1, appears to be a new type of Homo that falls right between Lucy and Homo habilis. The fossil’s slim molars and proportionate jaw are hallmarks of habilis, for instance, but its primitive chin looks a lot more like Lucy’s. For now, the researchers are calling their discovery “Homo species indeterminate,” as they still aren’t exactly sure what it is.
Most 2.8 million-year-old fossils are too ancient to date by conventional means, so the researchers sampled volcanic ash above and below the jawbone and then used argon-40 dating to determine the age of the eruption that formed each sample. The results give us the youngest and oldest dates that the hominin who owned LD 350-1 could have lived—2.5 and 2.8 million years ago, respectively.
Scientists have discovered that a 150-million-year-old Stegosaurus stenops specimen would have been similar in weight to a small rhino when it died.
Calculating body mass in animals that have been dead for many millions of years has been difficult for scientists. There are two methods for calculating body mass. One relies on researchers taking measurements of limb bones and extrapolating body mass from a large dataset of living animals, while the other produces a 3D model of the animal and applies densities to body segments to calculate mass. However, the two methods often produce differing results.
The researchers from Imperial College London and the Natural History Museum are the first to combine both methods to calculate the body mass of an extinct creature and obtain an accurate measurement. They used this approach on a Stegosaurus skeleton nicknamed Sophie, which was found in Wyoming in the USA in 2003. They have calculated that Sophie would have weighed around 1,600 kg, similar in weight to a small rhino.
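For a flavor of the first method, limb-bone scaling, here is a minimal Python sketch. The regression coefficients are the widely cited quadruped equation of Campione and Evans (2012), and the bone circumferences are hypothetical values chosen to land near Sophie's published mass; neither comes from this article.

```python
import math

# Assumed quadruped scaling law (Campione & Evans 2012; NOT from the article):
# log10(body mass in g) = 2.754 * log10(humerus + femur circumference, mm) - 1.097
def quadruped_mass_kg(humerus_circ_mm: float, femur_circ_mm: float) -> float:
    log_mass_g = 2.754 * math.log10(humerus_circ_mm + femur_circ_mm) - 1.097
    return 10 ** log_mass_g / 1000.0

# Hypothetical circumferences, for illustration only:
print(f"~{quadruped_mass_kg(180, 270):.0f} kg")  # ~1,600 kg, a small-rhino weight
```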
Dr Susannah Maidment, Junior Research Fellow from the Department of Earth Science and Engineering at Imperial College London, said: “Although the Stegosaurus is something of an iconic dinosaur, scientists know very little about its biology because its fossils are surprisingly rare. We don't actually know whether Sophie was female or male, despite its nickname. When it died, Sophie was a young adult - equivalent to a human teenager. Although there is no evidence for why it died, it seems that the carcass fell into a shallow pond, where it was quickly buried, preventing other animals from scavenging it, and explaining why it is so well preserved.”
An experiment not much bigger than a tabletop, using ultra-cold metal plates, could serve up a cosmic feast. It could give us a glimpse of quantum gravity and so lead to a "theory of everything": one that unites the laws of quantum mechanics, governing the very small, and those of general relativity, concerning the monstrously huge.
Such theories are difficult to test in the lab because they probe such extreme scales. But quantum effects have a way of showing up unexpectedly. In a strange quantum phenomenon known as the Casimir effect, two sheets of metal held very close together in a vacuum will attract each other.
The effect occurs because, even in empty space, there is an electromagnetic field that fluctuates slightly all the time. Placing two metal sheets very close to one another limits the fluctuations between them, because the sheets reflect electromagnetic waves. But elsewhere the fluctuations are unrestricted, and this pushes the plates together.
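For the electromagnetic case, the attraction between ideal parallel plates follows a standard textbook formula. A minimal Python sketch of that conventional Casimir result (this is the well-known electromagnetic calculation, not Quach's gravitational one):

```python
import math

HBAR = 1.054571817e-34  # reduced Planck constant, J*s
C = 2.99792458e8        # speed of light, m/s

def casimir_pressure(gap_m: float) -> float:
    """Attractive pressure (Pa) between ideal plates: pi^2*hbar*c / (240*a^4)."""
    return math.pi ** 2 * HBAR * C / (240 * gap_m ** 4)

print(f"{casimir_pressure(100e-9):.1f} Pa")  # ~13 Pa for plates 100 nm apart
```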
James Quach at the University of Tokyo suggests that we might be able to observe the equivalent effect for gravity. That would, in turn, be direct evidence of the quantum nature of gravity: the Casimir effect depends on vacuum fluctuations, which are only predicted by quantum physics.
But in order to detect it, you would need something that reflects gravitational waves – the ripples in space-time predicted by general relativity. Earlier research suggested that superconductors (metals cooled to close to absolute zero such that they lose all electrical resistance) might act as mirrors in this way.
"The quantum properties of superconductors may reflect gravitational waves. If this is correct, then the gravitational Casimir effect for superconductors should be large," says Quach. "The experiment I propose is feasible with current technology."
It's still unclear if superconductors actually reflect gravitational waves, however. "The exciting part of this paper has to do with a speculative idea about gravitational waves and superconductors," says Dimitra Karabali at Lehman College in New York. "But if it's right, it's wonderful."
Rain and snow have graced the West recently, causing many residents to breathe a sigh of relief about possible easing of the severe drought conditions that have worsened there over the past three-plus years. Complacency about drought and climate change is not warranted, say Dr. Noah Diffenbaugh and his research team from Stanford.
In “Anthropogenic warming has increased drought risk in California,” an article published online today by the Proceedings of the National Academy of Sciences, Diffenbaugh and colleagues present evidence for a somewhat counterintuitive hypothesis: higher temperatures, not necessarily precipitation shortages, drive the phenomenon of drought.
Diffenbaugh heads the Climate and Earth System Dynamics research group in the School of Earth, Energy & Environmental Sciences at Stanford, where he’s an associate professor and a senior fellow in the Stanford Woods Institute for the Environment. He was behind last September’s conclusions that climate change is occurring 10 times faster now than at any time in the past 65 million years. He has also said that at its current pace, climate change will involve a 5- to 6-degree Celsius rise by 2100.
“Smoking Kills” is more than just a catchy PSA or smoking cessation campaign slogan—it’s verifiable fact. Since the mid-1900s, study after study has generated compelling evidence linking smoking to increased mortality rates. Arguably the most influential of these is the 1956 publication of data from the “British Doctors Study,” which presented strong evidence that over half of smokers would eventually die of smoking-related complications. A new study published in BMC Medicine asserts that this mortality rate may be as high as 66 percent, meaning that two out of three smokers will eventually die from conditions associated with their smoking.
This study, put together by investigators from the National Centre for Epidemiology and Population Health at the Australian National University, followed 204,953 men and women over 45 years old from New South Wales, Australia. These participants were categorized into groups of smokers, past smokers, and never smokers.
Person-years are a measure of time used in epidemiological studies, in which the years studied for all participants in a study are added together. For example, if three people were studied for 10 years each, 30 total person-years would be reported in the study. In the study published in BMC Medicine, a total of 874,120 person-years were examined, and during those person-years, 5,593 deaths occurred among the study population.
Epidemiological outcomes are typically reported in terms of “Relative Risk”, the ratio of the risk of an outcome in a group exposed to a specific factor to the risk in an unexposed group. In this study, the relative risk of death (mortality) showed that male and female smokers were approximately 2.76 and 2.95 times as likely to die as never-smokers. Quitting helps; male and female past smokers were 1.27 and 1.39 times as likely to die as never-smokers. These numbers, while not surprising given the large body of data on the risks of smoking, are nonetheless a staggering reminder of the quantifiable risks of smoking.
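A minimal Python sketch of the person-year and relative-risk arithmetic described above; the group counts are invented for illustration and are not the study's actual tallies.

```python
# Person-years: follow-up time summed across participants.
follow_up_years = [10, 10, 10]       # three people, 10 years each
person_years = sum(follow_up_years)  # 30, as in the example above

# Relative risk: ratio of death rates per person-year, exposed vs. unexposed.
def relative_risk(deaths_exp, py_exp, deaths_unexp, py_unexp):
    return (deaths_exp / py_exp) / (deaths_unexp / py_unexp)

# Hypothetical counts chosen to reproduce the quoted RR for male smokers:
print(person_years)                                          # 30
print(round(relative_risk(276, 100_000, 100, 100_000), 2))   # 2.76
```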
For the first time ever, scientists have photographed light behaving simultaneously as both a particle and a wave. The image is a momentous achievement: after decades of attempts by the scientific community, it provides direct observation of both behaviors at once. Previous research projects have successfully observed wave-like behaviors and particle-like behaviors in light, but never at the same time.
The dual behavior of light, which is demonstrated through quantum mechanics and was first proposed by Albert Einstein, was only possible to capture by scientists at École Polytechnique Fédérale de Lausanne (EPFL), Switzerland, due to an unorthodox imaging technique. The scientists generated the image with electrons, making use of EPFL’s ultrafast energy-filtered transmission electron microscope. This gave them a rare advantage over other institutions, as EPFL has one of only two such microscopes in the world.
The image was achieved first by firing a pulse of laser light at a minuscule metallic nanowire, adding energy to charged particles in the nanowire and making them vibrate. The light waves travel along the nanowire in opposite directions, like lanes of cars on a road, but when they meet from opposite directions they form a new wave that appears as if it is “standing in place”, effectively confined to the nanowire. This wave, which radiates around the nanowire, was the light source that was imaged.
The scientists fired a stream of electrons in close proximity to the nanowire and imaged their interaction with this “standing wave”. As the electrons came into contact with the light, their changes in behavior acted as a visualization of the light’s behavior: the electrons that interacted with the photons either slowed down or sped up, together forming a visualization of the light’s wave. However, the changes in speed also appeared as an exchange of quanta – packets of energy – between the electrons and the photons. These packets were the tell-tale sign of the light behaving as a particle.
Astronomers have reported the discovery of a star that passed within the outer reaches of our Solar System just 70,000 years ago, when early humans were beginning to gain a foothold here on Earth. The stellar flyby was likely close enough to have influenced the orbits of comets in the outer Oort Cloud, but Neandertals and Cro-Magnons – our early ancestors – were not in danger. Now astronomers are ready to look for more stars like this one.
Lead author Eric Mamajek from the University of Rochester and collaborators report in “The Closest Known Flyby of a Star to the Solar System” (published in the Astrophysical Journal on February 12, 2015) that while “the flyby of this system likely caused negligible impact on the flux of long-period comets, the recent discovery of this binary highlights that dynamically important Oort Cloud perturbers may be lurking among nearby stars.”
The star, named Scholz’s star, was just 0.8 light years from the Sun at closest approach. In comparison, the nearest known star to the Sun today is Proxima Centauri, at 4.2 light years. At present, Scholz’s star is 20 light years away, one of the 70 closest stars to our Solar System. The astronomers calculated, with 98% certainty, that Scholz’s star passed within this distance, approximately 50,000 Astronomical Units (A.U.), of the Sun.
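The distances above convert between light years and astronomical units with a single constant (1 light year ≈ 63,241 A.U.); a minimal Python sketch of the arithmetic:

```python
AU_PER_LY = 63241.0  # astronomical units per light year

def ly_to_au(light_years: float) -> float:
    return light_years * AU_PER_LY

print(f"{ly_to_au(0.8):,.0f} AU")  # ~50,600 AU: the flyby distance quoted above
print(f"{ly_to_au(4.2):,.0f} AU")  # ~265,600 AU: Proxima Centauri, for comparison
```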
In 1984, the paleontologists David Raup and Jack Sepkoski postulated that a dim dwarf star, now widely known on the internet as the Nemesis Star, was in a very long-period Solar orbit. The elliptical orbit would bring the proposed star into the inner Solar System every 26 million years, causing a rain of comets and mass extinctions with that periodicity. Given the sheer number of red dwarfs throughout the galaxy, it is no coincidence that Scholz’s star nearly fits such a scenario. Nemesis was proposed to be in an orbit extending 95,000 A.U., compared to Scholz’s nearest flyby distance of 50,000 A.U. Recent studies of impact rates on Earth, the Moon and Mars have discounted the existence of a Nemesis star (see “New Impact Rate Count Lays Nemesis Theory to Rest,” Universe Today, 8/1/2011).
But Scholz’s star — a real-life Oort Cloud perturber — is a small red dwarf star with an M9 spectral classification. M-class stars are the most common stars in our galaxy and likely the whole Universe, as 75% of all stars are of this type. Scholz’s is just 15% of the mass of our Sun. Furthermore, Scholz’s is a binary star system, with the secondary being a brown dwarf of class T5. Brown dwarfs are believed to be plentiful in the Universe, but due to their very low intrinsic brightness they are very difficult to discover … except, as in this case, as companions to brighter stars.
The astronomers reported that their survey of new astrometric data on nearby stars identified Scholz’s as an object of interest. The star’s transverse velocity (its sideways motion across the sky) was very low. Additionally, they recognized that its radial velocity (its motion towards or away from us) was quite high: the star was speeding directly away from our Solar System. How close could Scholz’s star have been to our system in the past?
Scholz’s star is an active star, and the researchers added that while it was nearby, it shone dimly at about 11th magnitude, but eruptions and flares on its surface could have raised its brightness to visible levels, so it could have been seen as a “new” star by the primitive humans of the time.
In our solar system, there are only two large rocky worlds, Venus and Earth. Mercury and Mars are small enough that both lost most of their internal heat billions of years ago, and they have largely ceased to evolve further. (The ancient, preserved surface of Mars is what makes it so attractive to explore for the types of habitable environments that were long ago erased from the Earth’s surface.) Both Venus and the Earth, however, retain substantial heat in their cores. That heat drives plate tectonics on our world and appears to have caused the near-global resurfacing of Venus in the last few hundred million years, which counts as recent when compared to the age of the solar system.
While Venus and Earth have similar sizes and are solar system neighbors, they have evolved very differently. Venus today lacks oceans, appears to lack plate tectonics, and has a massive carbon dioxide atmosphere that creates a greenhouse effect that makes the surface extremely hot.
Understanding why Venus and Earth became so different will help us understand why Earth evolved as it has. Venus provides the contrast to the Earth that can help us both trace the origins of our world’s characteristics and gauge the range of possibilities for similarly sized planets orbiting other stars.
Today, our knowledge of Venus’ surface and its interior is similar to our knowledge of Mars in the 1970s following the Viking mission. The Soviet Union placed several probes on the surface that made simple measurements in the hour or so before the surface heat fried their electronics. NASA’s Magellan spacecraft mapped the surface with radar in the early 1990s at about 120 m resolution globally. We know, however, from our experiences mapping the Moon and Mars’ surfaces that teasing out the details of geologic processes requires mapping surfaces at resolutions better than 50 m, with smaller areas mapped at a few meters resolution.
Mapping Venus’ surface (with one exception we’ll return to later) requires using imaging radars that can penetrate its thick cloud cover. The technology in the early 1990s when Magellan flew was relatively new and crude by today’s standards. Now imaging radars are widely used to study the Earth both from airplanes and from satellites. The technology is mature and relatively low cost.
As a result, something of a cottage industry has grown up proposing new missions to map Venus either through the European Space Agency’s Medium Class program or through NASA’s Discovery program. The different accounting rules applied by the two agencies make direct cost comparisons difficult, but these missions cost in the neighborhood of $500M to $600M. A Venus radar mapping mission has been proposed for the current ESA Medium Class competition, and I hear that up to three missions are in competition for selection through the NASA program.
The European selection process tends to be more open than the U.S. process, and the EnVision team led by Dr. Richard Ghail at Imperial College London shared a copy of their proposal to ESA with me. The EnVision mission would address several key questions about how Venus and Earth diverged.
Billions of years ago, a huge primitive ocean covered one-fifth of the red planet’s surface, making it warm, wet and ideal for alien life to gain a foothold, Nasa scientists say.
The huge body of water spread over a fifth of the planet’s surface, as great a portion as the Atlantic covers the Earth, and was a mile deep in places. In total, the ocean held 20 million cubic kilometers of water, or more than is currently found in the Arctic Ocean on Earth, the researchers found.
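As a rough consistency check, the quoted area and volume imply a plausible average depth. A minimal Python sketch; Mars's total surface area (~1.45e8 km²) is a standard value we assume, while the one-fifth coverage and 20 million km³ volume come from the article.

```python
MARS_SURFACE_KM2 = 1.45e8   # assumed standard value, not from the article
ocean_area_km2 = MARS_SURFACE_KM2 / 5
ocean_volume_km3 = 20e6     # quoted above

mean_depth_km = ocean_volume_km3 / ocean_area_km2
print(f"mean depth ~{mean_depth_km:.2f} km (~{mean_depth_km / 1.609:.2f} mi)")
# ~0.69 km on average, consistent with being "a mile deep in places"
```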
Unveiled by Nasa on Thursday, the compelling evidence for the primitive ocean adds to an emerging picture of Mars as a warm and wet world in its youth, trickling with streams and dotted with winding river deltas and long-standing lakes soon after it formed 4.5 billion years ago.
The view of the planet’s ancient history radically rewrites what many scientists believed only a decade ago. Back then, flowing water was widely considered to have been a more erratic presence on Mars, gushing forth only rarely, and never forming long-standing seas and oceans.
“A major question has been how much water did Mars actually have when it was young and how did it lose that water?” said Michael Mumma, a senior scientist at Nasa Goddard Space Flight Center in Maryland.
Writing in the journal Science, the Nasa team, and others at the European Southern Observatory (ESO) in Munich, provide an answer after studying Mars with three of the most powerful infrared telescopes in the world.
The scientists used the Keck II telescope and Nasa’s Infrared Telescope Facility, both in Hawaii, and the ESO’s Very Large Telescope in Chile, to make maps of the Martian atmosphere over six years. They looked specifically at how different forms of water molecules in the Martian air varied from place to place over the changing seasons.
It’s “Groundhog Day” in the cosmos. This is the first time astronomers have been able to see the same explosion over and over again, and its unique properties may help them better understand not only the nature of these spectacular phenomena but also cosmological mysteries like dark matter and how fast the universe is expanding. Astronomers using the Hubble Space Telescope say they have been watching the same star blow itself to smithereens in a supernova explosion over and over again, thanks to a trick of Einsteinian optics.
The star exploded more than nine billion years ago on the other side of the universe, too far for even the Hubble to see without special help from the cosmos. In this case, however, light rays from the star have been bent and magnified by the gravity of an intervening cluster of galaxies so that multiple images of it appear.
Four of them are arranged in a tight formation known as an Einstein Cross surrounding one of the galaxies in the cluster. Since each light ray follows a different path from the star to here, each image in the cross represents a slightly different moment in the supernova explosion.
“I was sort of astounded,” said Patrick Kelly of the University of California, Berkeley, who discovered the supernova images in data recorded by the space telescope in November. “I was not expecting anything like that at all.” Dr. Kelly is lead author of a report describing the supernova published recently in Science.
Robert Kirshner, a supernova expert at the Harvard-Smithsonian Center for Astrophysics who was not involved in the work, said: “We’ve seen gravitational lenses before, and we’ve seen supernovae before. We’ve even seen lensed supernovae before. But this multiple image is what we have all been hoping to see.”
Saturn is orbited by 62 official moons, the largest of which is Titan. However, Titan is not your average satellite - larger than the planet Mercury, Titan has a thick nitrogen atmosphere and large liquid hydrocarbon lakes on its surface. Unfortunately, it has been difficult to obtain much information about the lakes’ depth or composition from the orbital missions. NASA has recently revealed what a conceptual submarine mission to Kraken Mare, the largest sea on Titan, would look like. Kraken Mare contains enough liquid methane to fill Lake Michigan three times over. Conditions are presumed to be rough, with changing tides and massive waves.
The hypothetical submarine, which would be dropped into the sea, would travel about 2,000 kilometers (1,250 miles) over the course of a 90-day mission. While the craft wouldn’t have a problem staying under the sea and diving during that time, it would need to surface in order to transmit data back to Earth. It would be powered by a radioisotope thermoelectric generator, which has no moving parts, making it a good choice for a craft with such a long journey. Most of the power would be used to propel the submarine while under the surface, but the craft would be capable of performing its science tasks as well.
During the mission, the submarine would make a number of observations and collect data using a variety of instruments. The main objectives would be to analyze the chemical composition of the liquid, along with other oceanographic features such as currents and tidal patterns. The craft would also be equipped with cameras in order to image Titan’s shoreline and landscape. The science goals are pretty vague at such an early juncture, but would become more refined and detailed if mission planning continues.
There are a number of technological and logistical obstacles to address before any proposed launch dates are developed, including the mechanics of Titan’s orbit: Saturn, and Titan with it, takes nearly 30 Earth years to revolve around the Sun, which will influence when such a mission could take place.
Since 1987, the Bulletin of the Atomic Scientists has been counting up each country's nuclear arsenal in its Nuclear Notebook, peeling back the veil of secrecy that often surrounds these numbers. The Bulletin has now gone and made its Nuclear Notebook into a neat interactive graphic.
There are nine nuclear states: the U.S., Russia, the United Kingdom, France, China, India, Pakistan, Israel, and North Korea. The 70 years’ worth of data isn’t necessarily surprising, but it really drives home how the world’s nuclear arsenal is completely dominated by the U.S. and Russia. The other countries barely register as a blip. The full interactive graphic lets you sift through the data country by country and year by year. Check it out at the Bulletin’s website.
Deer aren't the slim, graceful vegans we thought they were. Scientists using field cameras have caught deer preying on nestling song birds. And it's not just deer. Herbivores the world over may be supplementing their diets with meat.
When researchers in North Dakota set up "nest cams" over the nests of song birds, they expected to see a lot of nestlings and eggs get taken by ground squirrels, foxes, and badgers. Squirrels hit thirteen nests, but other meat-eaters made a poor showing. Foxes and weasels only took one nest each. Know what fearsome animal out-did either of those two sleek, resourceful predators?
White-tailed deer. These supposed herbivores placidly ate living nestlings right out of the nest. And if you're thinking that it must be a mistake, that the deer were chewing their way through some vegetation and happened to get a mouthful of bird, think again. Up in Canada, a group of ornithologists were studying adult birds. In order to examine them closely, the researchers used "mist-nets." These nets, usually draped between trees, are designed to trap birds or bats gently so they can be collected, studied, and released. When a herd of deer came by, the deer walked up to the struggling birds and ate them alive, right out of the nets.
This behavior is not limited to one species or one continent. Last year, a farmer in India made a video of a cow eating a recently-hatched chick. Some scientists speculate that herbivores turn to meat when they're not getting enough nutrients in their diet. It's possible. A biologist in Scotland documented red deer eating seabird chicks, and concluded it was how they got the dietary boost necessary to grow their antlers. The same researcher also documented sheep eating the heads and legs off of seabird chicks. And then there's another cow in India, which reportedly ate fifty chickens. There may be a specific need that drives herbivores to occasionally eat meat. It's also possible, experts say, that eating meat, when it can't run away from them, is just something supposed "herbivores" do, and we're finally getting wise to it.
The California company EnChroma is creating lenses that allow some colorblind people to see colors for the first time. Colorblindness is just the latest problem that scientists have tried to solve with a technical fix. They’ve modified the DNA of plants such as corn to resist pests and fight disease, and now are building electronic bees to pollinate them. Drugs let antsy children concentrate in class and help depressed adults feel balanced. Cochlear implants help the deaf hear, and mechanical limbs help athletes win Olympic medals.
It is no surprise, then, that scientists have made breakthroughs with colorblindness, which is the most common congenital disorder in humans: More than 15 million people in the U.S. and over 300 million worldwide don’t see normal colors. Most are men who inherit it from their mothers’ fathers.
Despite how common this condition is, most people don’t understand it. The colorblind are almost all actually red-green colorblind, but that doesn’t mean they can’t see red and green. The colorblind can see the colors when they’re vivid, but make mistakes when they’re faint. And because so many colors such as pink or purple contain just a little bit of red or green, mistakes are common.
It’s treated as a joke, even among the celebrity colorblind. Didn’t you know Mark Zuckerberg made Facebook blue because it’s the easiest color for him to see? If Van Gogh had normal color vision, would his paintings have looked more or less intense? Is defective vision the reason why Bill Clinton has trouble seeing stains? Colorblind men wear clashing ties, buy unripe bananas for breakfast, and mix up subway lines on their way to work. They get confused by line graphs during meetings, and try to push through the red “occupied” signs on bathroom doors. To a colorblind man, the red lipstick you’re wearing might not be that impressive, but neither will your blemishes.
Based in Berkeley, California, Don McPherson, who has a PhD in glass science from Alfred University, originally specialized in creating eyewear for doctors to use as protection during laser surgery. Rare earth iron embedded in the glasses absorbed a significant amount of light, enabling surgeons not only to stay safe, but also to clearly differentiate between blood and tissue during procedures.
In fact, surgeons loved the glasses so much, they began disappearing from operating rooms. This was the first indication that they could be used outside the hospital. McPherson, too, began casually wearing them, as sunglasses. “Wearing them makes all colors look incredibly saturated,” he says. “It makes the world look really bright.”
It wasn’t until his friend Angell borrowed his sunglasses at an Ultimate Frisbee game, however, that McPherson realized they could serve a broader purpose and help those who are colorblind. After making this discovery, he spent time researching colorblindness, a condition he knew very little about, and ultimately applied for a grant from the National Institutes of Health to begin conducting clinical trials.
McPherson and two colleagues, Tony Dykes and Andrew Schmeder, went on to found EnChroma Labs, a company dedicated to developing everyday sunglasses for the 300 million people in the world with color vision deficiency. They've been selling the glasses, with sporty, trendy, Ray-Ban-like frames, since December 2012, at prices ranging from $325 to $450. The EnChroma team has refined the product significantly, most recently changing the lenses from glass to a much more consumer-friendly polycarbonate in December 2014.
The company’s eyewear can help up to 80 percent of the customers who come to them. The remaining 20 percent, including the writer of a recent Atlantic article who tested the glasses, are missing an entire class of photopigments, either green or red—a condition EnChroma is not currently able to address.
UCLA life scientists have created an accurate new method to identify genetic markers for many diseases — a significant step toward a new era of personalized medicine, tailored to each person’s DNA and RNA. This powerful new method, called GIREMI (pronounced Gir-REMY), will help scientists to inexpensively identify RNA editing sites, genetic mutations and single nucleotide polymorphisms — tiny variations in a genetic sequence — and can be used to diagnose and predict the risk of a wide range of diseases from cancers to schizophrenia, said Xinshu (Grace) Xiao, senior author of the research and a UCLA associate professor of integrative biology and physiology in the UCLA College.
Details about GIREMI were published March 2 in the advance online edition of the journal Nature Methods. The research was funded by the National Institutes of Health and the National Science Foundation. Xiao is making the software available on her website as a free download, enabling scientists worldwide to use this potent method in their own research on any number of diseases. President Obama’s budget encourages doctors to design individually tailored treatments based on genetic and molecular differences. This approach, which is called personalized medicine or precision medicine, holds the potential of “delivering the right treatment at the right time, every time, to the right person,” Obama said.
Many genes contain RNA editing sites, which are not yet well understood, but appear to hold clues to many diseases. One might think that whatever is in the DNA we inherited from our parents would eventually be expressed in our proteins, but it turns out there is a modification process, called RNA editing, that can contribute to different types of cancer, autism, Alzheimer’s disease, Parkinson’s and many others, Xiao said.
RNA editing modifies nucleotides, the units of our genetic material whose patterns carry the data required for constructing proteins, which in turn provide the components of cells and tissues. If you had an “A” nucleotide in your DNA, for example, it may be modified into a “G” when the gene is expressed.
RNA editing is different from mutations. A mutation is written incorrectly in our genes. In RNA editing, our genetic material is normal, but modifications occur later when a gene is expressed.
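A minimal Python sketch of the distinction drawn above, on invented toy sequences; real tools such as GIREMI work statistically from RNA-sequencing reads, not by naive string comparison.

```python
dna = "ATGGCATCA"  # inherited genomic sequence (toy example)
rna = "ATGGCGTCA"  # expressed transcript: the A at position 5 reads as G

for i, (d, r) in enumerate(zip(dna, rna)):
    if d != r:
        # The genome itself is normal; the change appears only in the RNA,
        # so this is a candidate editing site rather than a mutation.
        print(f"position {i}: DNA={d}, RNA={r} -> candidate RNA editing site")
```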
GIREMI was developed over the past two years by Xiao and Qing Zhang, a postdoctoral scholar in her laboratory. It is the most accurate and sensitive method for identifying RNA editing sites, as well as SNPs and mutations in RNA. Differentiating SNPs, most of which appear not to be harmful, from RNA editing sites has been very difficult and previously required sequencing a person’s entire genome.
“We can predict RNA editing sites and SNPs without sequencing the whole genome,” said Xiao, a member of UCLA’s Institute for Quantitative and Computational Biosciences, Molecular Biology Institute and also the Jonsson Comprehensive Cancer Center. “Now you don’t have to spend thousands of dollars sequencing the DNA; you can sequence only the RNA. Our method will be easily applicable to all the existing RNA data sets, and will help to identify SNPs and mutations at a large cost reduction from current methods.”
Research on RNA editing is at an early stage. “We are trying to discover as many editing sites as possible,” said Xiao, whose research group is working to apply GIREMI to many diseases. “This method can be easily applied to any RNA sequencing data sets to discover new RNA editing sites that are specific to a certain disease.”
Many RNA editing sites are specific to the brain, Xiao and Zhang found, indicating RNA editing is involved in brain function and neurological disorders. There are more than 10,000 known RNA editing sites in the brain and probably many more, she said.
People have “abundant differences” in RNA editing sites. Studying 93 people whose RNA has been sequenced, Xiao and Zhang found that each person has unique RNA editing sites in their immune system’s lymphoblast cells, which are precursors of white blood cells that protect us from infectious diseases and foreign invaders.
RNA has been widely known as the cellular messenger that carries DNA’s instructions for making proteins to other parts of the cell, but it is now understood to perform sophisticated chemical reactions and is believed to perform an extraordinary number of other functions, at least some of which are unknown.
Analysis of the genomes of 69 ancient Europeans has revealed that herders moved en masse from Russia into Central Europe around 4,500 years ago. These migrants may be responsible for the expansion of Indo-European languages, which make up the majority of spoken tongues in Europe today.
An international team has published the research in the journal Nature. Prof David Reich and colleagues extracted DNA from remains found at archaeological sites around the continent. They used a new DNA-enrichment technique that greatly reduces the amount of sequencing needed to obtain genome-wide data.
Their analyses show that 7,000-8,000 years ago, a closely related group of early farmers moved into Europe from the Near East, confirming the findings of previous studies. The farmers were distinct from the indigenous hunter-gatherers they encountered as they spread around the continent. Eventually, the two groups mixed, so that by 5,000-6,000 years ago, the farmers' genetic signature had become melded with that of the indigenous Europeans.
But previous studies show that a two-way amalgam of farmers and hunters is not sufficient to capture the genetic complexity of modern Europeans. A third ancestral group must have been added to the melting pot more recently.
Prof Reich and colleagues have now identified a likely source area for this later diaspora. The Bronze Age Yamnaya pastoralists of southern Russia are a good fit for the missing third genetic component in Europeans. The team analysed nine genomes from individuals belonging to this nomadic group, which buried their dead in mounds known as kurgans.
The scientists contend that a group similar to the Yamnaya moved into the European heartland after the invention of wheeled vehicles, contributing up to 50% of ancestry in some modern north Europeans. Southern Europeans on the whole appear to have been less affected by the expansion.
India’s air pollution, ranked among the world’s worst, is reducing the life expectancy of over half of the country’s population by more than three years, according to a new study.
Researchers from the University of Chicago, Harvard and Yale wrote in this month’s Economic & Political Weekly that more than 660 million Indians live in areas where fine-particulate matter pollution exceeds levels considered safe by Indian standards. If India reverses this trend to meet those standards, the 660 million people affected would add about 3.2 years to their lives—saving a total of 2.1 billion life-years.
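The headline total follows directly from the two quoted figures; a one-step check in Python:

```python
affected_people = 660e6  # Indians in areas exceeding fine-particulate standards
years_gained = 3.2       # average life-expectancy gain if standards were met

print(f"{affected_people * years_gained / 1e9:.1f} billion life-years")  # 2.1
```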
“India’s focus is necessarily on growth. However for too long, the conventional definition of growth has ignored the health consequences of air pollution,” said Michael Greenstone, an author of the study and director of the Energy Policy Institute at the University of Chicago. “This study demonstrates that air pollution retards growth by causing people to die prematurely. Other studies have also shown that air pollution reduces productivity at work, increases the incidence of sick days and raises health care expenses that could be devoted to other goods.”
The new figures come after World Health Organization estimates showed 13 of the 20 most polluted cities in the world were in India, including the worst-ranked city, Delhi. India has the highest rate of death caused by chronic respiratory diseases anywhere in the world.
Rohini Pande, a study co-author and director of Evidence for Policy Design at the Harvard Kennedy School, said, “The loss of more than two billion life years is a substantial price to pay for air pollution. It is in India’s power to change this in cost-effective ways that allow hundreds of millions of its citizens to live longer, healthier and more productive lives. Reforms of the current form of regulation would allow for health improvements that lead to increased growth.”
Thick armor and jaws packed full of teeth aren't the only defences that alligators and crocodiles have. They also have formidable immune systems and some of the protective molecules that enable this have now been identified. Their discovery in the blood of the American alligator might even pave the way for a new generation of antibiotics.
Crocodilians have existed on Earth for at least 37 million years. Over the course of their evolution, they have developed a very strong defence against infection. "They inflict wounds on each other from which they frequently recover without complications from infection despite the fact that the environments in which they live are less than sterile," says Barney Bishop of George Mason University in Fairfax, Virginia, co-author of the new study.
American alligators have an enviable innate immune system, the “primitive” first line of defence that is shared by all vertebrates. In 2008, chemists in Louisiana found that blood serum taken from the reptiles destroyed 23 strains of bacteria and depleted HIV. The germ-killing molecules were identified as enzymes that break down a type of lipid.
Although their results have yet to lead to any new antibiotics, enzymes aren't the only pathogen-busting molecules that alligators have up their sleeve. Bishop's group has now identified and isolated peptides known as CAMPs, or cationic antimicrobial peptides. These molecules are positively charged, so the team developed nanoparticles to electrostatically pick them out of the complex mix of proteins in alligator blood plasma.
In total, the group fished out 45 peptides. Of these, they chemically synthesised eight and evaluated their antimicrobial properties. Five killed some of the E. coli bacteria they were presented with, while the other three destroyed most of the E. coli and also showed some activity against bacteria including Pseudomonas aeruginosa, which can cause inflammation and sepsis, and Staphylococcus aureus, which can trigger skin infections, sinusitis and food poisoning. So far, the peptides have performed well, says Bishop. Identifying novel antimicrobial peptides is urgently needed because of the growing problem of antibiotic resistance, says Guangshun Wang at the University of Nebraska Medical Center in Omaha. “Because of the novelty of the sequences,” he says, “these peptides provide new templates for developing antimicrobials to combat superbugs.”
Scientists first observed Saturn’s auroras in 1979. Decades later, these shimmering ribbons of light still fascinate. For one thing they’re magnificently tall, rising hundreds of miles above the planet’s poles. And unlike on Earth where bright displays fizzle after only a few hours, auroras on Saturn can shine for days. Auroras are produced when speeding particles accelerated by the sun’s energy collide with gases in a planet’s atmosphere. The gases fluoresce, emitting flashes of light at different wavelengths. Watch the video to see an edge-on view of Saturn’s northern and southern lights courtesy of NASA’s Hubble Space Telescope.
A new super-powerful electron microscope that can pinpoint the position of single atoms was unveiled today at the Science and Technology Facilities Council's Daresbury Laboratory in Cheshire. The microscope will help scientists push boundaries even further in fields such as advanced materials, healthcare and power generation.
The £3.7 million Nion Hermes Scanning Transmission Electron Microscope, one of only three in the world, is housed in the Engineering and Physical Sciences Research Council (EPSRC) SuperSTEM facility at Daresbury.
The microscope not only allows imaging of unprecedented resolution of objects a million times smaller than a human hair, but also analysis of materials. This means that researchers will not only be able to clearly identify the atoms, but observe the strength of the bonds between them. This will improve understanding of their electronic properties when in bulk and how they may perform when used.
Minister for Universities, Science and Cities, Greg Clark, said: "The UK is a world leader in the development and application of STEM (Scanning Transmission Electron Microscope) techniques, and this new super-powerful microscope will ensure we remain world-class.
"From developing new materials for space travel to creating a better, cheaper treatment for anaemia, this new super-powerful microscope lets UK scientists examine how materials behave at a level a million times smaller than a human hair. This exciting research will help lead to breakthroughs that will benefit not only our health but the environment too."
Professor Susan Smith, Head of STFC's Daresbury Laboratory, said: "SuperSTEM is home to real world-leading, even Nobel prize winning, research. It will be exciting to see what ground-breaking findings this new microscope will reveal, as it enables our UK academics, and their collaborators within the world-wide scientific community, to expand the frontiers of materials science."