This epic year for science saw the discovery of the Higgs boson and Curiosity’s arrival on Mars, but researchers also felt the sting of austerity.
NOTE: To subscribe to the RSS feed of Amazing Science, copy http://www.scoop.it/t/amazing-science/rss.xml into the URL field of your browser and click "subscribe".
This newsletter is aggregated from over 1450 news sources:
NOTE: All articles in the amazing-science newsletter can also be sorted by topic. To do so, click the FIND button (symbolized by the FUNNEL at the top right of the screen) to display all the relevant postings SORTED by TOPICS.
You can also type your own query, e.g., if you are looking for articles involving "dna" as a keyword.
MOST_READ • 3D_printing • aging • AI • anthropology • art • astronomy • bigdata • bioinformatics • biology • biotech • chemistry • computers • cosmology • education • environment • evolution • future • genetics • genomics • geosciences • green_energy • history • language • map • material_science • math • med • medicine • microscopy • nanotech • neuroscience • paleontology • photography • photonics • physics • postings • robotics • science • technology • video
Physicists Alexey Bezryadin, Alfred Hubler, and Andrey Belkin from the University of Illinois at Urbana-Champaign have demonstrated the emergence of self-organized structures that drive the evolution of a non-equilibrium system to a state of maximum entropy production. The authors suggest that the maximum entropy production principle (MEPP) underlies the evolution of the artificial system’s self-organization, in the same way that it underlies the evolution of ordered systems (biological life) on Earth. The team’s results are published in Nature Publishing Group’s online journal Scientific Reports.
MEPP may have profound implications for our understanding of the evolution of biological life on Earth and of the underlying rules that govern the behavior and evolution of all nonequilibrium systems. Life emerged on Earth from the strongly nonequilibrium energy distribution created by the Sun’s hot photons striking a cooler planet. Plants evolved to capture high-energy photons and produce heat, generating entropy. Then animals evolved to eat plants, increasing the dissipation of heat energy and maximizing entropy production.
In their experiment, the researchers suspended a large number of carbon nanotubes in a non-conducting non-polar fluid and drove the system out of equilibrium by applying a strong electric field. Once electrically charged, the system evolved toward maximum entropy through two distinct intermediate states, with the spontaneous emergence of self-assembled conducting nanotube chains.
In the first state, the “avalanche” regime, the conductive chains aligned themselves according to the polarity of the applied voltage, allowing the system to carry current and thus to dissipate heat and produce entropy. The chains appeared to sprout appendages as nanotubes aligned themselves so as to adjoin adjacent parallel chains, effectively increasing entropy production. But frequently, this self-organization was destroyed through avalanches triggered by the heating and charging that emanate from the emerging electric current streams.
“The avalanches were apparent in the changes of the electric current over time,” said Bezryadin.
Researchers at the University of California, San Diego Skaggs School of Pharmacy and Pharmaceutical Sciences used information collected from hundreds of skin swabs to produce three-dimensional maps of molecular and microbial variations across the body. These maps provide a baseline for future studies of the interplay between the molecules that make up our skin, the microbes that live on us, our personal hygiene routines and other environmental factors. The study, published March 30 by Proceedings of the National Academy of Sciences, may help further our understanding of the skin's role in human health and disease.
"This is the first study of its kind to characterize the surface distribution of skin molecules and pair that data with microbial diversity," said senior author Pieter Dorrestein, PhD, professor of pharmacology in the UC San Diego Skaggs School of Pharmacy. "Previous studies were limited to select areas of the skin, rather than the whole body, and examined skin chemistry and microbial populations separately."
To sample human skin nearly in its entirety, Dorrestein and team swabbed 400 different body sites of two healthy adult volunteers, one male and one female, who had not bathed, shampooed or moisturized for three days. They used a technique called mass spectrometry to determine the molecular and chemical composition of the samples. They also sequenced microbial DNA in the samples to identify the bacterial species present and map their locations across the body. The team then used MATLAB software to construct 3D models that illustrated the data for each sampling spot.
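In miniature, the mapping step looks something like the sketch below. This is an illustrative toy, not the team's actual MATLAB pipeline: the site names, coordinates, and intensity values are invented, and each surface point simply takes the value of its nearest swabbed site.

```python
# Toy 3D skin map: color any body-surface point with the measurement
# from the nearest swabbed site (nearest-neighbor interpolation).
# All site names, coordinates, and intensities are invented.
SAMPLES = {
    # site: ((x, y, z) in meters, relative molecular intensity)
    "forehead": ((0.0, 0.0, 1.7), 0.9),
    "forearm":  ((0.4, 0.0, 1.1), 0.6),
    "sole":     ((0.1, 0.1, 0.0), 0.1),
}

def intensity_at(point):
    """Return the intensity recorded at the swab site nearest to `point`."""
    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    _, value = min(SAMPLES.values(), key=lambda site: dist2(site[0], point))
    return value
```

With 400 real sampling sites instead of three, the same idea yields a full-body mosaic of molecular and microbial measurements.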
Despite the three-day moratorium on personal hygiene products, the most abundant molecular features in the skin swabs still came from hygiene and beauty products, such as sunscreen. According to the researchers, this finding suggests that 3D skin maps may be able to detect both current and past behaviors and environmental exposures. The study also demonstrates that human skin is not just made up of molecules derived from human or bacterial cells. Rather, external sources such as plastics from clothing, diet, and hygiene and beauty products also contribute to the skin's chemical composition.
The maps now allow these factors to be taken into account and correlated with local microbial communities.
Scientists based out of the University of Alberta have -- for the first time -- imaged a joint cracking in real time, effectively putting to rest a decades-long debate in the process. They revealed their success in the journal PLoS ONE. Doubtless you've experienced the physiological wonder that is a cracking knuckle. The audible pop it makes can sometimes be heard across an entire room, making many bystanders wince. But they probably have nothing to cringe about. While joint cracking may sound painful, it's not associated with any adverse health effects -- arthritis, for example.
Everyone knows that bending or stretching a joint is what causes it to crack, but what's going on under the skin? First off, a joint is where two bones meet. At the ends of each bone is soft, cushioning cartilage. Connecting the cartilage -- and thus the bones -- is a synovial membrane that's filled with a thick, lubricating fluid. Bending the joint can cause the membrane to stretch, which in turn causes the pressure inside it to drop and a bubble of dissolved gas to form within the fluid. The whole process is called tribonucleation.
“It’s a little bit like forming a vacuum,” says Professor Greg Kawchuk, the lead researcher. “As the joint surfaces suddenly separate, there is no more fluid available to fill the increasing joint volume, so a cavity is created...” For decades, prevailing wisdom has held that the popping noise is tied to these bubbles, but scientists have debated whether the sound is caused by the bubble's formation or its collapse. Thanks to Kawchuk and his team, we now know it's the former. When they watched a volunteer's knuckles crack inside an MRI machine in real time, the pop clearly occurred when the bubble formed. Moreover, the bubble persisted well after the sound was heard.
Researchers at the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) have found a new way of manipulating the walls that define magnetic domains (uniform areas in magnetic materials) and the results could one day revolutionize the electronics industry, they say. Gong Chen and Andreas Schmid, experts in electron microscopy with Berkeley Lab’s Materials Sciences Division, led the discovery of a technique by which the “spin textures” of magnetic domain walls in ultrathin magnets can be switched between left-handed, right-handed, cycloidal, helical and mixed structures.
The “handedness” or “chirality” of spin texture determines the movement of a magnetic domain wall in response to an electric current, so this technique, which involves the strategic application of uniaxial strain, should lend itself to the creation of domain walls designed for desired electronic memory and logic functions.
“The information sloshing around today’s Internet is essentially a cacophony of magnetic domain walls being pushed around within the magnetic films of memory devices,” says Schmid. “Writing and reading information today involves mechanical processes that limit reliability and speed. Our findings pave the way to use the spin-orbit forces that act upon electrons in a current to propel magnetic domain walls either in the same direction as the current, or in the opposite direction, or even sideways, opening up a rich new smorgasbord of possibilities in the field of spin-orbitronics.”
The study was carried out at the National Center for Electron Microscopy (NCEM), which is part of the Molecular Foundry, a DOE Office of Science User Facility. The results have been reported in a Nature Communications paper titled “Unlocking Bloch-type chirality in ultrathin magnets through uniaxial strain.”
In addition to carrying a negative electrical charge, electrons also carry a quantum mechanical property known as “spin,” which arises from tiny magnetic fields created by their rotational momentum. For the sake of simplicity, spin is assigned a direction of either “up” or “down.” Because of these two properties, a flow of electrons creates both charge and spin currents. Charge currents are well understood and serve as the basis for today’s electronic devices. Spin currents are just beginning to be explored as the basis for the emerging new field of spintronics. Coupling the flows of charge and spin currents together opens the door to yet another new field in electronics called “spin-orbitronics.” The promise of spin-orbitronics is smaller, faster and far more energy efficient devices through solid-state magnetic memory.
The key to coupling charge and spin currents lies within magnetic domains, regions in a magnetic material in which all of the spins of the electrons are aligned with one another and point in the same direction – up or down. In a magnetic material containing multiple magnetic domains, individual domains are separated from one another by narrow zones or “walls” that feature rapidly changing spin directions.
Applying a technique called “SPLEEM,” for Spin-Polarized Low Energy Electron Microscopy, to thin films of iron/nickel bilayers on tungsten, Chen and Schmid and their collaborators were able to stabilize domain walls that were a mixture of Bloch and Neel types. They also showed how the chirality of domain walls can be switched between left- and right-handedness. This was accomplished by controlling uniaxial strain on the thin films in the presence of an asymmetric magnetic exchange interaction between neighboring electron spins.
“Depending on their handedness, Neel-type walls are propelled with or against the current direction, while Bloch-type walls are propelled to the left or to the right across the current,” Chen says. “Our findings introduce Bloch-type chirality as a new spin texture and might allow us to tailor the spin structure of chiral domain walls. This would present new opportunities to design spin–orbitronic devices.”
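The geometric difference between the two wall types can be sketched with the standard one-dimensional wall profile. This is a textbook illustration, not the Berkeley Lab analysis, and the coordinate conventions here are assumptions for the sketch.

```python
import math

# Spin direction across a 1D domain wall centered at x = 0, using the
# standard profile theta(x) = 2*atan(exp(x/width)). Magnetization
# rotates from +z (x << 0) to -z (x >> 0). In a Neel wall the in-plane
# component lies along the wall normal (x); in a Bloch wall it lies
# perpendicular to it (y). `chirality` (+1 or -1) sets the sense of
# rotation, i.e. the wall's handedness.
def wall_profile(x, width=1.0, kind="neel", chirality=+1):
    theta = 2 * math.atan(math.exp(x / width))
    mz = math.cos(theta)
    m_inplane = chirality * math.sin(theta)
    if kind == "neel":
        return (m_inplane, 0.0, mz)   # (mx, my, mz)
    return (0.0, m_inplane, mz)       # Bloch
```

At the wall center the spin lies fully in-plane; flipping `chirality` reverses that in-plane direction, which is exactly the degree of freedom the strain technique switches.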
“Magnetization is a 3D vector, not just a scalar property and in order to see spin textures, the three Cartesian components of the magnetization must be resolved,” Schmid says. “Berkeley Lab’s SPLEEM instrument is one of a mere handful of instruments worldwide that permit imaging all three Cartesian components of magnetization. It was the unique SPLEEM experimental capability that made this spin-orbitronics research possible.”
When people think about SETI, the Search for Extraterrestrial Intelligence, they imagine messages sent via radio—Jodie Foster tuning antennas, hoping to pick up signals from the “billions and billions” of star systems pondered by Carl Sagan. Potential extraterrestrials might be beaming out messages into space and all we need to do is listen for them. Of course, even using light—the fastest possible signal-carrier—we would still have to wait years for messages from even the closest stars to reach us. And that presumes that we are listening at just the right moment in the Universe’s 13.8-billion-year history. The odds that different civilizations across the galaxy overlap so precisely that we could listen right now to aliens’ messages could be quite low.
But there’s another potential way to send signals across the cosmos, albeit one that requires a great deal of patience and expertise: storing messages inside genetic material. Encoding a message in an organism or virus and then sending it on an interstellar voyage to other planets requires a long wait—it might take eons for intelligent life to evolve on the destination planet, where the message is waiting—but these organisms, packed with a message, are patient, and they are fecund. They quietly reproduce, carrying and copying the message hidden within their informational backbones. Until one day a sentient being takes notice of the momentous message all around it (or even within itself): “You are not alone.”
It’s natural for many of us to look for secrets, whether we hope for hidden passageways behind bookcases or a map scrawled on the back of the Declaration of Independence, as Nicolas Cage would have it. As pattern-seeking creatures, we perceive patterns, connections, and structures even where there are none, a phenomenon known as apophenia.
The geneticist George Church has shown that information can be easily stored in DNA: He and his team used a DNA microchip to write an entire book 70 billion times into DNA. They used a single nucleotide for each bit of the information, encoded in binary, similar to how the researchers looking at the bacteriophage had suspected a smart alien might encode information.
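A one-bit-per-nucleotide codec is simple enough to sketch. The fixed A/G mapping below is a simplification for illustration (published DNA-storage schemes typically allow a choice of bases per bit to avoid long single-letter runs), not Church's exact encoding.

```python
# Toy one-bit-per-nucleotide codec: each character becomes 8 bits,
# with 0 -> 'A' and 1 -> 'G'. Illustrative only.
def text_to_dna(text):
    bits = [(ord(ch) >> i) & 1 for ch in text for i in range(7, -1, -1)]
    return "".join("A" if b == 0 else "G" for b in bits)

def dna_to_text(seq):
    bits = [0 if base == "A" else 1 for base in seq]
    chars = []
    for i in range(0, len(bits), 8):
        byte = 0
        for b in bits[i : i + 8]:
            byte = (byte << 1) | b
        chars.append(chr(byte))
    return "".join(chars)
```

Round-tripping a short message through the four-letter alphabet and back is enough to see why a genome can, in principle, carry arbitrary text.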
DNA is stable at room temperature and could last for a long time, potentially making it a good information-storage mechanism. But here’s the hitch, as Church explains: “We purposefully avoided living cells. In an organism, your message is a tiny fraction of the whole cell, so there’s a lot of wasted space. But more importantly, almost as soon as a DNA goes into a cell, if that DNA doesn’t earn its keep, if it isn’t evolutionarily advantageous, the cell will start mutating it, and eventually the cell will completely delete it.”
It’s one thing to have a chunk of a DNA crystal sitting around somewhere for tens of thousands of years. But that chunk, no matter how stable, is small. If you want it to really work as a vehicle for a message, it needs to reproduce and spread. So you need to put it into something living. But by doing that, if the message isn’t also useful for the organism, it will quickly mutate into oblivion.
Nevertheless, if these messages, hidden in some terrestrial DNA, do exist, it will take time and patience to determine this. Perhaps, as Davies argued, it is time for a massive effort that combs through all the digitized genomic information we have generated. An open-source project that does just this would be exciting for those interested in both astronomy and biology, whether or not it yields any sort of finding.
But one thing is likely. Any message that is found will be discovered in the regions most precious to our survival: the ones that ensure we are able to pass our genes on to the next generation. Anything less and it would have likely been swept away long ago by the constant churn of evolution.
Physicists have managed to make homogeneous cylindrical objects completely invisible in the microwave range. Contrary to the now prevailing notion of invisibility that relies on metamaterial coatings, the scientists achieved the result using a homogeneous object without any additional coating layers. The method is based on a new understanding of electromagnetic wave scattering.
The scientists studied light scattering from a glass cylinder filled with water. In essence, such an experiment represents a two-dimensional analog of a classical problem of scattering from a homogeneous sphere (Mie scattering), the solution to which has been known for almost a century. However, this classical problem contains unusual physics that manifests itself when materials with high values of refractive index are involved. In the study, the scientists used ordinary water, whose refractive index can be regulated by changing temperature.
As it turned out, high refractive index is associated with two scattering mechanisms: resonant scattering, which is related to the localization of light inside the cylinder, and non-resonant scattering, which is characterized by a smooth dependence on the wave frequency. The interference between these mechanisms gives rise to Fano resonances. The researchers discovered that at certain frequencies, waves scattered via the resonant and non-resonant mechanisms have opposite phases and cancel each other out, making the object invisible.
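The cancellation can be seen in the generic Fano line shape, the textbook formula for interference between a resonant and a smooth scattering pathway. This is an illustration of the mechanism, not the paper's full electromagnetic calculation.

```python
# Generic Fano line shape: sigma(eps) = (q + eps)^2 / (1 + eps^2),
# where eps is the reduced detuning from the resonance and q is the
# asymmetry parameter set by the ratio of resonant to non-resonant
# scattering amplitudes. At eps = -q the two pathways are exactly out
# of phase and the scattered intensity vanishes: the invisibility
# condition exploited in the experiment.
def fano_lineshape(eps, q):
    return (q + eps) ** 2 / (1 + eps ** 2)
```

Changing the water temperature shifts the resonance, which in effect moves the system toward or away from the zero of this curve at a fixed drive frequency.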
The work led to the first experimental observation of an invisible homogeneous object by means of scattering cancellation. Importantly, the developed technique made it possible to switch from visibility to invisibility regimes at the same frequency of 1.9 GHz by simply changing the temperature of the water in the cylinder from 90 °C to 50 °C.
"Our theoretical calculations were successfully tested in microwave experiments. What matters is that the invisibility idea we implemented in our work can be applied to other electromagnetic wave ranges, including to the visible range. Materials with corresponding refractive index are either long known or can be developed at will," said Mikhail Rybin, first author of the paper and senior researcher at the Metamaterials Laboratory in ITMO University.
The discovery of the invisibility phenomenon in a homogeneous object, rather than an object covered with additional coating layers, is also important from the engineering point of view. Because it is much easier to produce a homogeneous cylinder, the discovery could prompt further development of nanoantennas, wherein invisible structural elements could help reduce disturbances. For instance, invisible rods could be used as supports for a miniature antenna complex connecting two optical chips.
Mars should be too cold to support liquid water at the surface, but salts in the soil lower its freezing point - allowing briny films to form. The results lend credence to a theory that dark streaks seen on features such as crater walls could be formed by flowing water. The results are published in the journal Nature Geoscience.
Scientists think thin films of water form when salts in the soil, called perchlorates, absorb water vapor from the atmosphere. The temperature of these liquid films is about -70 °C - too cold to support any of the microbial life forms that we know about. Forming in the top 15 cm of the Martian soil, the brines would also be exposed to high levels of cosmic radiation - another challenge to life. But it's still possible that organisms could exist somewhere beneath the surface on Mars, where conditions are more favorable.
The researchers drew together different lines of evidence from the suite of instruments carried by the Curiosity rover. The Rover Environmental Monitoring System (REMS) - essentially the vehicle's weather station - measured the relative humidity and temperature at the rover's landing site of Gale Crater. Scientists were also able to estimate the subsurface water content using data from an instrument called Dynamic Albedo of Neutrons (DAN). These data were consistent with water in the soil being bound to perchlorates. Finally, the Sample Analysis at Mars (SAM) instrument gave the researchers the content of water vapor in the atmosphere.
The results show conditions were right for the brines to form during winter nights at the Martian equator, where Curiosity landed. But the liquid evaporates during the Martian day when temperatures rise. Javier Martin-Torres, a co-investigator on the Curiosity mission and lead scientist on REMS, told BBC News the detection was indirect but convincing: "What we see are the conditions for the formation of brines on the surface. It's similar to when people were discovering the first exoplanets. "They were not seeing the planets, but they were able to see the gravitational effects on the star. "These perchlorate salts have a property called deliquescence. They take the water vapour from the atmosphere and absorb it to produce the brines."
An otherwise healthy 65-year-old woman developed a relentless burning sensation in her mouth that stumped doctors and dentists for months before its strange cause was found, according to a recent report of her case. The burning got worse whenever the woman brushed her teeth but subsided within 10 minutes. The pain went away one month after she first experienced it, but then returned a year later and remained constant. She saw a dentist, an oral surgeon and her family doctor, but none of them could find any lesions in the mouth or other possible causes of the burning.
They prescribed mouthwashes, milk-of-magnesia rinses and anti-anxiety drugs, and recommended avoiding toothpaste with whitening agents. But nothing relieved the burning sensation. The woman had a case of a condition called "burning mouth syndrome," which is a chronic, burning sensation inside the mouth, usually in the lips, tongue or palate, according to the study, published April 1 in the journal BMJ Case Reports.
"It's common in postmenopausal women, and affects up to 7 percent of the general population," said study co-author Dr. Maria Nagel, a neuro-virologist and professor at the University of Colorado School of Medicine in Aurora. Nagel compared the feeling to a "sunburn inside the mouth," adding that it feels similar to the pain caused by a tooth infection or a root canal.
The condition can be a side effect of certain drugs, but other cases have no apparent medical or dental cause, Nagel said. After the woman had experienced this pain for six months, doctors tested her saliva for the virus that causes oral herpes, the herpes simplex virus type 1 (HSV-1). The virus commonly causes cold sores around the mouth and lips, but the woman didn't have any cold sores.
The tests showed that the woman's saliva was swarming with the infectious particles. "If she'd had cold sores, it would have been obvious," Nagel told Live Science. "Most people don't think of HSV-1 as the potential cause of burning mouth syndrome, so they don't test for it. But it's easily treatable with antiviral medication," she said.
The woman began taking an antiviral drug, and her pain disappeared within five days. Follow-up tests of her saliva — done four weeks later, and again six months later — found no hint of the virus. A year and a half after finishing her treatment, the patient remains pain-free, researchers said.
Researchers from Durham University and the University of São Paulo-USP have developed a method of using single-walled carbon nanotube (SWCNT) composites in “unconventional” computing. By studying the mechanical and electrical properties of the materials, they discovered a correlation between SWCNT concentration (and hence viscosity and conductivity) and the computational capability of the composite.
“Instead of creating circuits from arrays of discrete components (transistors in digital electronics), our work takes a random disordered material and then ‘trains’ the material to produce a desired output,” said Mark K. Massey, research associate, School of Engineering and Computing Sciences at Durham University. This emerging field of research is known as “evolution-in-materio,” a term coined by Julian Miller at the University of York.
It combines materials science, engineering, and computer science. It uses an approach similar to biological evolution: materials can be “trained” to mimic electronic circuits — without needing to design the material structure in a specific way. “The material we use in our work is a mixture of [conducting] carbon nanotubes and [insulating] polymer, which creates a complex electrical structure,” explained Massey.
“When voltages (stimuli) are applied at points of the material, its electrical properties change. When the correct signals are applied to the material, it can be trained or ‘evolved’ to perform a useful function.”
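The training loop can be sketched with a toy stand-in for the composite. Everything below is invented for illustration: the simulated "material", the XOR target, and the evolutionary parameters. It is not the Durham/USP setup, but it shows the evolve-the-configuration-voltages idea.

```python
import random

random.seed(0)

# Stand-in "material": its binary output depends nonlinearly on two
# logic inputs and on four configuration voltages c0..c3 that the
# evolutionary search tunes. A real composite would be a physical
# black box probed with electrodes.
def material(i0, i1, config):
    c0, c1, c2, c3 = config
    signal = c0 * i0 + c1 * i1 + c2 * (i0 * i1) + c3
    return 1 if signal > 0 else 0

TARGET = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}  # XOR

def fitness(config):
    """Number of input patterns (out of 4) the material gets right."""
    return sum(material(i0, i1, config) == out
               for (i0, i1), out in TARGET.items())

def evolve(generations=200, pop_size=20, sigma=0.2):
    population = [[random.uniform(-1, 1) for _ in range(4)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]      # elitist selection
        children = [[v + random.gauss(0, sigma) for v in random.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        population = parents + children
    return max(population, key=fitness)
```

A hand-picked configuration such as (1, 1, -2, -0.5) already realizes XOR in this toy model; the point of `evolve()` is that it can find such a point without being told anything about the material's internals, which is the essence of evolution-in-materio.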
The research “could lead to new techniques for making electronics devices for analog signal processing or low-power, low-cost devices in the future.” The research is described in a paper in the Journal of Applied Physics.
Scientists have modeled the stunning structure of the receptor in our bodies that jolts our senses when we eat sushi garnished with spicy wasabi -- and it turns out that this so-called 'wasabi receptor' may hold clues for developing new pain treatments.
The receptor, a protein called TRPA1, resides in the cellular membrane of our sensory nerve cells. Not only does it detect certain chemical agents outside of our bodies -- from wasabi to tear gas -- but it also gets triggered by pain-inducing signals within our bodies from itches and inflammation.
“The pain system is there to warn us when we need to avoid things that can cause injury, but also to enhance protective mechanisms,” Dr. David Julius, professor and chair of the physiology department at the University of California, San Francisco, and senior co-author of the new study, said in a written statement. “Of course, this information may also help guide the design of new analgesic drugs.”
The researchers built the new detailed 3D model using an advanced imaging technique known as electron cryo-microscopy, Science magazine reported. Using the model, the researchers discovered a spot in the receptor where wasabi chemical compounds bind, according to a video from UCSF about the research. They noticed that when a receptor encounters such chemical compounds, it activates nerve fibers that then send pain signals to the brain.
There are already a few experimental drugs that target the wasabi receptor to alleviate pain, Smithsonian magazine reported, and the new model allows scientists to see the exact cleft in the protein where those drugs bind -- a discovery which may help guide the development of innovative pain medications.
"A dream of mine is that some of the work we do will translate into medicines people can take for chronic pain," Julius told NPR. "What the structure does is, it gives pharmaceutical firms sort of a map for either tweaking the drugs that they have... or for developing drugs that might have different properties."
The study was published online in the journal Nature on April 8, 2015.
Most recent advances in artificial intelligence—such as mobile apps that convert speech to text—are the result of machine learning, in which computers are turned loose on huge data sets to look for patterns.
To make machine-learning applications easier to build, computer scientists have begun developing so-called probabilistic programming languages, which let researchers mix and match machine-learning techniques that have worked well in other contexts. In 2013, the U.S. Defense Advanced Research Projects Agency, an incubator of cutting-edge technology, launched a four-year program to fund probabilistic-programming research.
At the Computer Vision and Pattern Recognition conference in June, MIT researchers will demonstrate that on some standard computer-vision tasks, short programs—less than 50 lines long—written in a probabilistic programming language are competitive with conventional systems with thousands of lines of code. "This is the first time that we're introducing probabilistic programming in the vision area," says Tejas Kulkarni, an MIT graduate student in brain and cognitive sciences and first author on the new paper. "The whole hope is to write very flexible models, both generative and discriminative models, as short probabilistic code, and then not do anything else. General-purpose inference schemes solve the problems."
By the standards of conventional computer programs, those "models" can seem absurdly vague. One of the tasks that the researchers investigate, for instance, is constructing a 3-D model of a human face from 2-D images. Their program describes the principal features of the face as being two symmetrically distributed objects (eyes) with two more centrally positioned objects beneath them (the nose and mouth). It requires a little work to translate that description into the syntax of the probabilistic programming language, but at that point, the model is complete. Feed the program enough examples of 2-D images and their corresponding 3-D models, and it will figure out the rest for itself.
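The flavor of the approach can be conveyed with a much smaller example than face modeling. The model below (a noisy line rather than a face) and the crude rejection-sampling inference are illustrative stand-ins, not the MIT system, but they show the division of labor: you write only the generative model, and a generic inference scheme recovers the latent variables.

```python
import random

random.seed(1)

# Generative model: sample latent slope and intercept, then simulate
# noisy observations. This is the only problem-specific code.
def generative_model():
    a = random.uniform(-2, 2)                      # latent slope
    b = random.uniform(-2, 2)                      # latent intercept
    xs = [0.0, 1.0, 2.0, 3.0]
    ys = [a * x + b + random.gauss(0, 0.1) for x in xs]
    return a, b, ys

# Generic inference: rerun the model many times and keep the latents
# whose simulated data lands close to the observations (rejection
# sampling). Nothing here knows the model describes a line.
def infer(observed, num_samples=20000, tol=0.5):
    accepted = []
    for _ in range(num_samples):
        a, b, ys = generative_model()
        if sum((y - o) ** 2 for y, o in zip(ys, observed)) < tol:
            accepted.append((a, b))
    return accepted
```

Real probabilistic programming systems replace the brute-force rejection step with far smarter general-purpose inference, but the program the user writes stays this short.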
"When you think about probabilistic programs, you think very intuitively when you're modeling," Kulkarni says. "You don't think mathematically. It's a very different style of modeling." The new work, Kulkarni says, revives an idea known as inverse graphics, which dates from the infancy of artificial-intelligence research. Even though their computers were painfully slow by today's standards, the artificial intelligence pioneers saw that graphics programs would soon be able to synthesize realistic images by calculating the way in which light reflected off of virtual objects. This is, essentially, how Pixar makes movies.
Sea temperatures around Australia are posting "amazing" records that climate specialists say signal global records set in 2014 may be broken this year and next.
March sea-surface temperatures in the Coral Sea region off Queensland broke the previous high by 0.12 degrees – a big jump for oceans that are typically more thermally stable than land. Temperatures for the entire Australian ocean region also set new highs for the month, the Bureau of Meteorology said.
For the Coral Sea region – which includes the entire Great Barrier Reef and stretches from Cape York almost as far south as Brisbane – sea-surface temperatures from January to March were 0.73 degrees above average at 29.16 degrees, making it the warmest three-month period on record, the bureau said. The unusual warmth off Australia comes as the Pacific Ocean remains primed for an El Nino event, as the bureau reported last month.
If such an event occurs, the underlying warming from climate change will get a further boost from natural variability, making 2014's ranking as the hottest year in records going back to the 1880s likely to be short-lived, according to Andy Pitman, head of climate research at the University of NSW. "If we do get an intense El Nino, it will blitz the records," Professor Pitman said. "The climate is on a performance-enhancing drug and that drug is carbon dioxide."
A warm 2015 is very likely, particularly given the El Nino-like conditions in the Pacific, which will provide a significant backdrop to climate change negotiations for a new international treaty in Paris late this year, Professor Pitman said. "If governments turn up in Paris after a series of major climate events, the foundation of their discussions...would be somewhat different than if they turned up in a period of benign climate," he said.
A UNSW-led research team has encoded quantum information in silicon using simple electrical pulses for the first time, bringing the construction of affordable large-scale quantum computers one step closer to reality. The idea is to exploit the advanced fabrication methods developed in semiconductor nanoelectronics and create quantum bits (qubits) that are both highly coherent and easy to control and couple to each other — a challenging task.
The findings were published in the open-access journal Science Advances. The UNSW team, which is affiliated with the ARC Centre of Excellence for Quantum Computation & Communication Technology, was first to demonstrate single-atom spin qubits in silicon, reported in Nature in 2012 and 2013. The team later improved the control of these qubits to an accuracy of above 99% and established the world record for how long quantum information can be stored in the solid state, as published in Nature Nanotechnology in 2014.
The researchers have now demonstrated a key step that had remained elusive since 1998: controlling a qubit using electric fields instead of pulses of oscillating magnetic fields. Lead researcher Andrea Morello, a UNSW Associate Professor from the School of Electrical Engineering and Telecommunications, said the method works by distorting the shape of the electron cloud attached to the atom, using a very localized electric field. “This distortion at the atomic level has the effect of modifying the frequency at which the electron responds. Therefore, we can selectively choose which qubit to operate.”
The findings suggest that it would be possible to locally control individual qubits with electric fields in a large-scale quantum computer using only inexpensive voltage generators, rather than requiring expensive high-frequency microwave sources.
Moreover, this specific type of quantum bit can be manufactured by placing qubits inside a thin layer of specially purified silicon, containing only the silicon-28 isotope. “This isotope is perfectly non-magnetic and, unlike those in naturally occurring silicon, does not disturb the quantum bit,” Morello said.
The purified silicon was provided through collaboration with Keio University in Japan.
A newly developed spectroscopy method is helping to clarify the poorly understood molecular process by which an anti-HIV drug induces lethal mutations in the virus’ genetic material. The findings from the University of Chicago and the Massachusetts Institute of Technology could bolster efforts to develop the next generation of anti-viral treatments.
Viruses can mutate rapidly in order to adapt to environmental pressure. This feature also helps them become resistant to anti-viral drugs. But scientists have developed therapeutic anti-viral agents for HIV, hepatitis C and influenza using a strategy called lethal mutagenesis.
This strategy seeks to extinguish viruses by forcing their already high mutation rates above an intolerable threshold. If viruses experience too many mutations, they can’t properly manage their genetic material.
“They can’t replicate and so are quickly eliminated,” said Andrei Tokmakoff, the Henry G. Gale Distinguished Service Professor in Chemistry at UChicago. “In order to make this work, you need a stealth mutagen. You need something sneaky, something that the virus isn’t going to recognize as a problem.”
Tokmakoff and his associates at UChicago and MIT reported new details of the stealthy workings of the anti-HIV agent KP1212 in March in the Proceedings of the National Academy of Sciences. Supporting data were collected with two-dimensional infrared spectroscopy, an advanced laser technique that combines ultrafast time resolution with high sensitivity to chemical structure.
Scientists design lethally mutagenic molecules such as KP1212 to resemble the natural DNA bases that form the adenine-thymine and cytosine-guanine base pairs. “These analogs can bind to the wrong base partners and therefore lead to genetic mutations,” said the study’s lead author, Sam Peng, who was a visiting graduate research assistant at UChicago.
KP1212 is a cytosine variation, which normally would pair with guanine during replication. But biochemical experiments and clinical trials have shown that KP1212 induces mutations by pairing with adenine. A leading proposal suggested that KP1212 derived its mutagenicity by shape shifting—converting into a different molecular structure by repositioning its hydrogen atoms on nitrogen and oxygen atoms.
Most experimental tools would have difficulty distinguishing between the normal and shape-shifted structures because they interconvert very rapidly. With two-dimensional infrared spectroscopy, the UChicago team was able to distinguish between the two structures. The team also was able to measure how rapidly the shape shifting occurs under physiological conditions: in 20 billionths of a second.
The research team expected to find only two dominant tautomers, but their experiments showed that many more exist. In addition to taking on different forms as a neutral molecule, KP1212 also could accept an extra proton, giving it a positive charge at physiological levels of acidity (a pH of approximately 5.5 to 7) that made possible even more rearrangements and tautomer structures.
“The number of possibilities exploded,” Tokmakoff said. The experiments also showed that both the protonated and non-protonated forms facilitated the viral mutation rate. Even in the absence of the protonated form, the virus still mutated, just at a lower rate. “We found that under physiological pHs, KP1212 is significantly protonated and this protonated form induces even higher mutation rates, reaching approximately 50 percent,” Peng said.
A pioneering infrared scan of 100,000 galaxies by Penn State astronomers has failed to detect any signs of galaxy-spanning extraterrestrial supercivilizations. This result, though very preliminary, may be a sign that aliens aren't capable of conquering entire galaxies.
Back in the 1960s, Russian cosmologist Nikolai Kardashev devised the famous scale that now bears his name. He proposed a simple numbering system — from one to three — that can be used to classify hypothetical alien civilizations according to the amount of energy at their disposal. According to the scale, a K1 civ has captured the entire energy output of its home planet, while a K2 civ has tapped into all the power produced by its home star.
But then there are K3 civs — so-called supercivilizations — who have tapped into virtually all of the energy produced by their own galaxy. As study co-author Jason Wright told io9: "Type III civilizations in the sense that Nikolai Kardashev originally defined them, were 'maximal' energy users: they command all of the starlight in their galaxy." This could be accomplished by ETIs in any number of ways, including vast complexes of Dyson spheres and the establishment of Matrioshka Brains.
K3 civilizations should be reasonably easy to detect from a distance. According to fundamental thermodynamics, the energy pulled in by a K3 civ must still be radiated away as heat in the mid-infrared wavelengths. These galactic-scale signatures, though far away, can still be detected from Earth.
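As a rough sanity check on why waste heat should show up in the mid-infrared, Wien's displacement law relates a thermal emitter's temperature to its peak emission wavelength. The temperatures below are illustrative assumptions (radiators at a few hundred kelvin), not figures from the survey; a minimal sketch:

```python
# Wien's displacement law: lambda_peak = b / T
WIEN_B = 2.898e-3  # Wien's displacement constant, metre-kelvins

def peak_wavelength_um(temp_kelvin):
    """Peak blackbody emission wavelength, in micrometres."""
    return WIEN_B / temp_kelvin * 1e6

# Waste heat radiated at roughly room temperature (~300 K)
# peaks near 10 micrometres, squarely in the mid-infrared.
for temp in (150, 300, 600):
    print(f"{temp} K -> {peak_wavelength_um(temp):.1f} um")
```

Structures radiating near room temperature peak at around 10 micrometres, which is why a mid-infrared survey like WISE's is the natural place to look for such signatures.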
With this in mind, a team of astronomers from Penn State recently completed a survey, known as the Glimpsing Heat from Alien Technologies Survey (G-HAT), of 100,000 galaxies to see if they could find traces of galaxy-spanning supercivilizations. Their results now appear in the Astrophysical Journal.
The G-HAT team, led by postbaccalaureate researcher Roger Griffith, analyzed practically the entire catalog of detections made by NASA's WISE orbiting observatory. That's nearly 100 million entries. The researchers honed this list down to ~100,000 of the most promising candidates, looking for objects consistent with galaxies emitting too much mid-infrared radiation.
No obvious alien-filled galaxies were detected. That said, 50 galaxies did feature higher-than-usual levels of mid-infrared radiation. Further analysis will be required to determine whether these are caused by some natural astronomical process, or whether they indicate highly advanced extraterrestrial civilizations that have not yet filled their entire galaxy.
This poses a bit of a problem for astrobiologists. Back in 1975, astronomer Michael H. Hart conjectured that super-advanced aliens should be able to colonize an entire galaxy within a reasonably short amount of time, at least from a cosmological perspective. Either that or humanity is alone in the Milky Way. It's this line of reasoning that led the Penn State researchers to conduct their intergalactic survey. "There has been a line of argument, originating with Michael Hart, that there should not be any advanced civilization in the Milky Way, because if there were, they would have taken over the entire galaxy by now. If this is correct, then a search for civilizations spanning other galaxies is the best approach."
The next step for the G-HAT team is to scale things down a bit to see if less energy-intensive civilizations might exist in these galaxies. This is where Carl Sagan's extension of the Kardashev scale might come in handy. "On Sagan's scale, we want to push down from Type 3.0 to Type 2.9 or 2.8," says Wright. "That is, search for civilizations using 10% or even 1% of the starlight in a galaxy."
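One common reading of Sagan's continuous scale, consistent with Wright's numbers above, is that each 0.1 step below Type 3.0 corresponds to a factor of ten less of the galaxy's starlight in use. A minimal sketch of that relationship (the function name is our own):

```python
def starlight_fraction(kardashev_type):
    """Fraction of a galaxy's starlight a civilization uses,
    relative to a 'maximal' Type 3.0 energy user.
    Each 0.1 step on Sagan's continuous scale is a factor of 10 in power."""
    return 10 ** (10 * (kardashev_type - 3.0))

print(round(starlight_fraction(3.0), 3))  # 1.0  (all starlight)
print(round(starlight_fraction(2.9), 3))  # 0.1  (10%)
print(round(starlight_fraction(2.8), 3))  # 0.01 (1%)
```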
Scientists working in East Africa say they've unearthed the oldest stone tools ever found. They were apparently made 500,000 years before the human genus, Homo, emerged. A team led by Sonia Harmand from Stony Brook University in New York found the tools in Kenya, near Lake Turkana. It's an area that's yielded numerous fossils and tools from early humans. These newly discovered tools have been reliably dated to 3.3 million years ago, according to scientists who've reviewed the research. That's 700,000 years older than the previous record for the oldest stone tools ever found.
That's remarkable because it's well before the human genus, Homo, emerged 2.8 million years ago. So clearly members of our own genus didn't make these tools. The team presumes they were made by an earlier human ancestor, probably a member of a genus called Australopithecus, which first appeared in Africa about four million years ago and includes the famous ape-like creature known as Lucy.
Leading stone tool experts who've seen the tools say they have the markings of a process called "knapping." Knapping a piece of stone produces flakes that can have sharp edges and are useful for working with plants, nuts or meat. These flakes can be distinguished from naturally occurring pieces of rock. Knapping also leaves characteristic marks on the rock from which the flakes are chipped.
Richard Potts, head of the Human Origins Program at the Smithsonian Institution, has examined the tools. He tells NPR they're a "mixed bag," with some quite crude and others a little more sophisticated. Potts says they're not as advanced as most early human-made tools, but "there's no doubt it's purposeful" tool-making. And it's more sophisticated than the kind of tool-making that chimpanzees do, he adds, such as shaping sticks to probe for termites in their underground mounds.
Genetically engineered fibers of the protein spidroin — the construction material for spider webs — are an ideal matrix (substrate or frame) for cultivating heart tissue cells, Moscow Institute of Physics and Technology (MIPT) researchers have found, as noted in an open-access article in the journal PLOS ONE.
Regenerative methods can solve the problem of transplant rejection, but it’s a challenge to find a suitable matrix to grow cells on: The material should be non-toxic, elastic, and not rejected by the body or impede cell growth. KurzweilAI has reported on a number of solutions.
Researchers led by Professor Konstantin Agladze, who heads the Laboratory of the Biophysics of Excitable Systems at MIPT, have been cultivating tissues that contract and conduct excitation waves, from cells called cardiomyocytes.
They decided to explore using synthetic electrospun fibers of spidroin as a matrix. The fibers are light, five times stronger than steel, twice as elastic as nylon, and capable of stretching by a third of their length, which is why they are already used as substrates for growing implants such as bones, tendons and cartilage, as well as for wound dressings.
But could they also work for soft tissues, such as the heart? Agladze decided to find out. His team seeded isolated neonatal rat cardiomyocytes on fiber matrices. Using a microscope and fluorescent markers, the researchers monitored the growth of the cells and tested their contractility and ability to conduct electric impulses, which are the main features of normal cardiac tissue. Within three to five days a layer of cells formed on the substrate. They were able to contract synchronously and conduct electrical impulses just like the tissue of a living heart would.
“Cardiac tissue cells successfully adhere to the substrate of recombinant spidroin,” Agladze says. “They grow forming layers and are fully functional, which means they can contract coordinately.”
The American Physical Society is holding its annual April Meeting at the moment in Baltimore, Maryland, and one of the highlights, research-wise, comes to us courtesy of the Dark Energy Survey (DES) collaboration. This afternoon, the researchers released the first in a series of maps of the dark matter that makes up some 23% of all the “stuff” (matter and energy) in our universe. The map was constructed based on data collected by the Dark Energy Camera, the primary instrument of the DES. The camera is perched high on a mountaintop, mounted on a telescope at the Cerro Tololo Inter-American Observatory in Chile, the better to get high-resolution images with minimal interference.
Now in its second year, the DES began taking data on August 31, 2013, with an eye toward better understanding dark matter’s role in the formation of galaxies. The resulting map unveiled today is, as one might expect, spectacular — the first to trace in fine detail how dark matter is distributed across a huge swathe of sky, although it’s a mere 3% of the area the DES will cover by the time it finishes its five-year scheduled run. It’s not the first dark matter map ever, but it’s the largest and highest resolution so far.
The analysis — carried out by a team led by Argonne National Laboratory’s Vinu Vikram and Chihway Chang of the Swiss Federal Institute of Technology (ETH) in Zurich — looked at very subtle distortions in the shapes of two million galaxies to construct the map, using a technique called gravitational lensing, whereby the invisible gravitational effects of the dark matter bend light from those galaxies in predictable ways.
And so far, the researchers have found that the distribution of dark matter is pretty well in line with current theories — namely, that because there is significantly more dark matter than visible matter (a mere 4%) in the cosmos, galaxies were formed in those places where there are large concentrations of dark matter, and thus stronger gravity. Think of it as a delicate interplay between mass and light.
You can see that clustering in the color-coded image above, where the blue areas are where the density is about average, and the red and yellow areas depict regions of far greater density — places where there is more dark matter. The circles represent galaxies and galaxy clusters, which do indeed show up more in the higher-density areas. “Zooming into the maps, we have measured how dark matter envelops galaxies of different types and how together they evolve over cosmic time,” Chang said in an official press release. “We are eager to use the new data coming in to make much stricter tests of theoretical models.”
Cement has been called the foundation of modern civilization, the stuff of highways, bridges, sidewalks and buildings of all sizes. But its production comes with a huge carbon footprint. Environmental chemist David Stone was seeking a way to keep iron from rusting when he stumbled upon a possible substitute that requires significantly less energy. Special correspondent Kathleen McCleery reports.
A new rival to the lithium-ion battery has been created that charges in under a minute and still performs almost perfectly after being recharged thousands of times. The new battery is based on aluminium instead of lithium, which should make it both cheaper and safer than its lithium-ion competitors. The US team behind the aluminium-ion battery say that the technology could find its way into the home, help store renewable energy for the power grid and even power vehicles.
The aluminum-ion battery is conceptually similar to the lithium-ion battery: when the battery is discharged atoms from a metal anode are oxidized, releasing electrons into the external circuit. When recharged, the electrons are driven back to the anode.
The aluminum-ion battery offers tantalizing solutions to problems with lithium-ion ones. Aluminium, being the most abundant metal in the Earth's crust, is much cheaper than lithium and is also much less reactive, so battery fires are unlikely to be a problem. Ionising aluminium also liberates three electrons compared with lithium's one, potentially giving the batteries a higher charge capacity. But aluminium-ion batteries have been plagued by numerous difficulties: the discharge voltages have often been as low as 0.55V and various cathodes trialled have disintegrated during repeated cycling, causing the batteries' capacity to drop to 85% or less within 100 cycles.
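The electron-count argument can be made concrete with Faraday's law: the theoretical specific capacity of a metal anode is n·F/M, for n electrons per atom, Faraday constant F, and molar mass M. The back-of-the-envelope comparison below uses standard physical constants, not figures from the paper; note that per gram lithium still comes out ahead, and aluminium's three electrons pay off per unit volume.

```python
FARADAY = 96485.0  # Faraday constant, C/mol

def theoretical_capacity_mah_per_g(n_electrons, molar_mass_g_mol):
    """Theoretical gravimetric charge capacity, n*F/M,
    converted from C/g to mAh/g (1 mAh = 3.6 C)."""
    return n_electrons * FARADAY / molar_mass_g_mol / 3.6

al = theoretical_capacity_mah_per_g(3, 26.98)  # aluminium: 3 electrons/atom
li = theoretical_capacity_mah_per_g(1, 6.94)   # lithium:   1 electron/atom
print(f"Al: {al:.0f} mAh/g, Li: {li:.0f} mAh/g")

# Per unit volume the three electrons win out
# (densities: Al ~2.70 g/cm3, Li ~0.534 g/cm3):
print(f"Al: {al * 2.70:.0f} mAh/cm3, Li: {li * 0.534:.0f} mAh/cm3")
```

This is one reason aluminium's real advantages in practice are cost, safety and cycle life rather than raw gravimetric capacity.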
Using their carbon foam cathode and ultra-dry electrolyte, the researchers produced a prototype battery with a discharge voltage of around 2V and an energy capacity similar to lead acid and nickel–metal hydride batteries. This battery lost very little of its storage capacity after 7000 cycles, making it far superior even to lithium-ion batteries, which last for about 1000 cycles. Perhaps most remarkably, the battery can safely be completely recharged in less than 60 seconds. This is nearly 100 times faster than the maximum charge rate for a lithium-ion battery. The battery can even be bent and folded safely, and the researchers drilled a hole through it while it was operating without causing a dangerous short circuit.
Team leader Hongjie Dai of Stanford University reveals that commercial companies are interested. He believes the battery is a promising replacement for nickel–metal hydride rechargeable batteries in home appliances and, beyond this, for storing electricity for the grid. At present, he says, the battery's energy density is limited by the bulky AlCl4- ions. 'Hopefully this work can really open up more research in this area,' he adds.
Brain signals can be read using soft, flexible, wearable electrodes that stick onto and near the ear like a temporary tattoo and can stay on for more than two weeks even during highly demanding activities such as exercise and swimming, researchers say. The invention could be used for a persistent brain-computer interface (BCI) to help people operate prosthetics, computers, and other machines using only their minds, scientists add.
For more than 80 years, scientists have analyzed human brain activity non-invasively by recording electroencephalograms (EEGs). Conventionally, this involves electrodes stuck onto the head with conductive gel. The electrodes typically cannot stay mounted to the skin for more than a few days, which limits widespread use of EEGs for applications such as BCIs.
Now materials scientist John Rogers at the University of Illinois at Urbana-Champaign and his colleagues have developed a wearable device that can help record EEGs uninterrupted for more than 14 days. Moreover, their invention survived despite showering, bathing, and sleeping. And it did so without irritating the skin. The two weeks might be "a rough upper limit, defined by the timescale for natural exfoliation of skin cells," Rogers says.
The device consists of a soft, foldable collection of gold electrodes only 300 nanometers thick and 30 micrometers wide mounted on a soft plastic film. This assemblage stays stuck to the body through the weak intermolecular attractions known as van der Waals forces—the same forces that help geckos cling to walls.
The electrodes are flexible enough to mold onto the ear and the mastoid process behind the ear. The researchers mounted the device onto three volunteers using tweezers. Spray-on bandage was applied once or twice a day to help the electrodes survive normal daily activities.
The electrodes on the mastoid process recorded brain activity while those on the ear were used as a ground wire. The electrodes were connected to a stretchable wire that could plug into monitoring devices. "Most of the experiments used devices mounted on just one side, but dual sides is certainly possible," Rogers says.
A pair of light waves – one zipping clockwise, the other counterclockwise around a microscopic track – may hold the key to creating the world’s smallest gyroscope: one a fraction of the width of a human hair. By bringing this essential technology down to an entirely new scale, a team of applied physicists hopes to enable a new generation of phenomenally compact gyroscope-based navigation systems, among other intriguing applications.
“We have found a new detection scheme that may lead to the world's smallest gyroscope,” said Li Ge of the Graduate Center and the College of Staten Island, City University of New York. “Though these so-called optical gyroscopes are not new, our approach is remarkable both in its super-small size and potential sensitivity.”
Ge and his colleagues – physicist Hui Cao and her student Raktim Sarma, both at Yale University in New Haven, Connecticut – recently published their results in The Optical Society’s (OSA) new high-impact journal Optica.
More than creative learning toys, gyroscopes are indispensable components in a number of technologies, including inertial guidance systems, which monitor an object’s motion and orientation. Space probes, satellites, and rockets continuously rely on these systems for accurate flight control. But like so many other essential pieces of aerospace technology, weight is a perennial problem. According to NASA, it costs about $10,000 for every pound lifted into orbit, so designing essential components that are smaller and lighter is a constant struggle for engineers and project managers.
If the size of an optical gyroscope is reduced to just a fraction of a millimeter, as is presented in the new paper, it could then be integrated into optical circuit boards, which are similar to a conventional electric circuit board but use light to carry information instead of electric currents. This could drastically reduce the equipment cost in space missions, opening the possibility for a new generation of micro-payloads.
As a patient prepares to become the first in history to receive his own genetically engineered stem cells for sickle cell disease, two top scientists are already improving the approach. The team of University of California, Los Angeles (UCLA) immunologist Donald Kohn will soon add a healthy version of the hemoglobin beta (HBB) gene, which is mutated in sickle cell disease, to blood stem cells taken from a patient’s bone marrow, according to Kohn. Then his team will give the corrected stem cells back. Kohn’s team is confident, given that the same approach—adding a gene to cells with one gene mutation—has been succeeding in two of Kohn’s (also pioneering) adenosine deaminase-severe combined immunodeficiency (ADA-SCID) gene therapy trials.
Meanwhile, both he and hematologist Linzhao Cheng of Johns Hopkins University published papers this month that went further, correcting the mutation in sickle cell blood cells. “Both studies for developing next-generation gene correction of sickle cell disease are worthy,” Cheng told Drug Discovery & Development. “These are significant steps forward from the previous gene therapy strategies: using viral vectors to add a copy of genes into hematopoietic stem and progenitor cells.”
Kohn’s team reported on two clinical trials, in which they gave a healthy ADA gene to patients with ADA-SCID, at the end of February at the American Society for Blood and Marrow Transplantation meeting. In a Phase 2 study, patients aged three months to 15 years were given their own blood stem cells back after a healthy version of the ADA gene was introduced to their cells via a retroviral vector. As a result of that 2009 to 2014 trial, nine of ten patients are still off enzyme replacement therapy, and three were able to discontinue intravenous immunoglobulin (IVIg). In May 2013, Kohn started a new ADA gene therapy trial, this time using a next-gen technology, a self-inactivating lentiviral vector considered safer. Eight patients, aged four to 42 months old, have been enrolled. The six who are more than 30 days out are all off enzyme replacement therapy.
In Blood this month, Kohn reported he has been working on a more precise approach: correcting the mutation using zinc finger nucleases (ZFNs). His approach is different from that of Cheng, who reports in Stem Cells that he corrected the mutation too, though by reprogramming adult blood cells into proliferating induced pluripotent stem cells (iPSCs) and then using the popular CRISPR/Cas9 technique.
Kohn told Drug Discovery & Development his team used ZFNs as “we started this work about four to five years ago when they were the main method” of gene editing. The third of the Big Three gene editing technologies, TALENs, was “just coming into use, and CRISPR was not even developed.”
Researchers at The University of Manchester have made a significant breakthrough in the development of synthetic pathways that will enable renewable biosynthesis of the gas propane. This research is part of a program of work aimed at developing the next generation of biofuels.
In this latest study, published in the journal Biotechnology for Biofuels, scientists at the University’s Manchester Institute of Biotechnology (MIB), working with colleagues at Imperial College London and the University of Turku, have created a synthetic pathway for biosynthesis of the gas propane. Their work brings scientists one step closer to the commercial production of renewable propane, a vital development as fossil fuels continue to dwindle.
Professor Nigel Scrutton, Director of the MIB, explains the significance of their work: “The chemical industry is undergoing a major transformation as a consequence of unstable energy costs, limited natural resources and climate change. Efforts to find cleaner, more sustainable forms of energy as well as using biotechnology techniques to produce synthetic chemicals are currently being developed at The University of Manchester.”
Natural metabolic pathways for the renewable biosynthesis of propane do not exist, but scientists at the University have developed an alternative microbial biosynthetic pathway to produce renewable propane. The team, led by Nigel Scrutton and Dr Patrik Jones of Imperial College, modified existing fermentative butanol pathways, using an engineered enzyme variant to redirect the microbial pathway to produce propane instead of butanol. The team was able to achieve propane biosynthesis, creating a platform for next-generation microbial propane production.
Propane has very good physicochemical properties, which allow it to be stored and transported in a compressed liquid form. Under ambient conditions it is a clean-burning gas, with existing global markets and infrastructure for storage, distribution and utilisation in a wide range of applications ranging from heating to transport fuel. Consequently, propane is an attractive target product in research aimed at developing new renewable alternatives to complement currently used petroleum-derived fuels.
In the 1970s, scientists noticed that soft-shell clams along the east coast of North America were dying from a strange type of cancer. Their blood, which was typically clear, would fill with so many cells that it would turn milky. The rogue cells clogged and infiltrated the clams’ organs, often killing them.
This clam leukemia seemed to be transmissible. If you took the blood of infected clams and injected it into healthy individuals, some of those recipients would develop the disease. For years, scientists suspected that a virus was involved. But Michael Metzger from Columbia University has a different explanation. His team has discovered that the thing that transmits the cancer isn’t a virus, but the cancer itself.
The clam leukemia is a contagious cancer—an immortal line of selfish shellfish cells that originated in a single individual and somehow gained the ability to survive and multiply in fresh hosts. The vast majority of cancers are not like this; they’re not contagious. Some are caused by contagious things, like viruses (HPV causes cervical cancer) or bacteria (Helicobacter pylori causes stomach cancer), but the cells themselves cannot leave one host and start growing in another. Once their host dies, so do they.
Until Metzger’s discovery, there were just two exceptions to this rule. The first is a facial tumour that afflicts Tasmanian devils. It spreads through bites, and poses a serious threat to the survival of these animals. The second is a venereal tumour that affects dogs. It arose around 11,000 years ago and has since spread around the world. That was it: two transmissible tumors. Now, there’s a third—and perhaps more on the way. “Maybe this is way more common than we thought in invertebrates, and especially in marine ones,” says Stephen Goff, who led the study. “We’re all sitting in the same ocean here.”
Goff studies viruses that cause leukaemia in mice. His interest in clams began with a phone call from Carol Reinisch, a marine biologist at Environment Canada. “She said: We have this disease in clams. It’s a leukaemia. Can you help us check if there’s a virus involved?” he recalls. He agreed, and she sent over some blood samples.
The team discovered that the disease is associated with a jumping gene—a stretch of DNA that can copy and paste itself into a new part of the clam genome. They called it Steamer. Healthy clams have between 2 and 10 copies of it in their genomes, but the ones with leukaemia have between 150 and 300 copies. Perhaps some virus was causing Steamer to multiply extravagantly. As it jumped into new places, it disrupted important genes, and triggered cancer. Here was “an example of catastrophic induction of genetic instability that may initiate or advance the course of leukaemia,” the team wrote.
If this idea is correct, Steamer should jump into different positions with each new affected clam. Instead, the team found Steamer in many of the same positions in clams from New York, Maine, and Prince Edward Island in Canada. “That’s when we were really surprised,” says Goff. “There was something fishy going on.” Next, they compared other positions in the genomes of healthy clams, diseased clams, and tumor cells. Right away, they saw that all the tumours are genetically identical, and none of them matched the genes of their host clams. That’s the same pattern that scientists see in the dog and Tasmanian devil tumors. It’s a clear signature of a contagious cancer.