NOTE: To subscribe to the RSS feed of Amazing Science, copy http://www.scoop.it/t/amazing-science/rss.xml into the URL field of your browser and click "subscribe".
This newsletter is aggregated from over 1450 news sources:
Flies might be smarter than you think. According to research reported in the Cell Press journal Current Biology on May 28, fruit flies know what time of day it is. What's more, the insects can learn to connect different scents with the sweet reward of sugar, depending on the hour: menthol in the morning and mushrooms in the afternoon.
Researchers say that the findings show the surprising mental abilities of animals, no matter how small. "If even the fly, with its miniature brain, has the sense of time, most animals may have it," says Martin Heisenberg of Rudolf Virchow Center in Germany.
In earlier studies, researchers showed that mice and honeybees can associate a reward--food or a mate, for instance--with a particular time of day. To understand how this memory for time works, Heisenberg and his colleagues turned to the fruit fly in the new study.
The researchers trained hungry flies to associate two different chemical odors with sugar, one in the morning and one in the afternoon, on two consecutive days. On the third day, they tested the flies' preference for one scent or the other.
The results were clear: the flies learned to switch their scent preference over the course of the day. Flies tested in the morning preferred the odor paired during training with sucrose in the morning, while flies tested in the afternoon preferred the odor paired with sucrose in the afternoon. This time-dependent preference held as long as the two training events were separated by at least four hours.
The researchers found that the flies' time-keeping ability remained both in constant darkness and with a regular light-dark cycle. The flies couldn't keep time, however, when the lights were kept on around the clock. Flies lacking clock genes known to be important for maintaining a daily circadian rhythm still learned to like certain odors, but they couldn't associate those scents with the time.
The findings show that flies can use time as an additional clue to find what's good to eat. The next step is to explore the underlying molecular mechanism for this time-odor learning in greater detail.
"Given the formidable collection of genetic tools for studying the fly brain, this can now be achieved," Heisenberg says.
Chemists at the University of Waterloo have discovered the key reaction that takes place in sodium-air batteries, which could pave the way for development of the so-called holy grail of electrochemical energy storage. The key lies in the discovery by Linda Nazar's group of a so-called proton phase-transfer catalyst. By isolating its role in the battery's discharge and recharge reactions, Nazar and colleagues were not only able to boost the battery's capacity, they achieved a near-perfect recharge of the cell. When the researchers eliminated the catalyst from the system, they found the battery no longer worked. Unlike the traditional solid-state battery design, a metal-oxygen battery uses a gas cathode that takes oxygen and combines it with a metal such as sodium or lithium to form a metal oxide, storing electrons in the process. Applying an electric current reverses the reaction and reverts the metal to its original form.
Understanding how sodium-oxygen batteries work has implications for developing the more powerful lithium-oxygen battery, which has been seen as the holy grail of electrochemical energy storage. Their results appear in the journal Nature Chemistry.
"Our new understanding brings together a lot of different, disconnected bits of a puzzle that have allowed us to assemble the full picture," says Nazar, a Chemistry professor in the Faculty of Science.
Columbia Engineering researchers have created the first single-molecule diode — the ultimate in miniaturization for electronic devices — with potential for real-world applications in electronic systems. The diode has a high rectification ratio (>250) and a high “on” current (~0.1 microamps), says Latha Venkataraman, associate professor of applied physics. “Constructing a device where the active elements are only a single molecule … which has been the ‘holy grail’ of molecular electronics, represents the ultimate in functional miniaturization that can be achieved for an electronic device,” she said.
With electronic devices becoming smaller every day, the field of molecular electronics has become ever more critical in solving the problem of further miniaturization, and single molecules represent the limit of miniaturization. The idea of creating a single-molecule diode was suggested by Arieh Aviram and Mark Ratner who theorized in 1974 that a molecule could act as a rectifier, a one-way conductor of electric current.
Researchers have since been exploring the charge-transport properties of molecules. They have shown that single-molecules attached to metal electrodes (single-molecule junctions) can be made to act as a variety of circuit elements, including resistors, switches, transistors, and, indeed, diodes. They have learned that it is possible to see quantum mechanical effects, such as interference, manifest in the conductance properties of molecular junctions.
Since a diode acts as an electricity valve, its structure needs to be asymmetric so that electricity flowing in one direction experiences a different environment than electricity flowing in the other direction. To develop a single-molecule diode, researchers have simply designed molecules that have asymmetric structures. “While such asymmetric molecules do indeed display some diode-like properties, they are not effective,” explains Brian Capozzi, a PhD student working with Venkataraman and lead author of the paper.
“A well-designed diode should only allow current to flow in one direction, and it should allow a lot of current to flow in that direction. Asymmetric molecular designs have typically suffered from very low current flow in both ‘on’ and ‘off’ directions, and the ratio of current flow in the two has typically been low. Ideally, the ratio of ‘on’ current to ‘off’ current, the rectification ratio, should be very high.”
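The rectification ratio Capozzi describes can be illustrated with the textbook Shockley ideal-diode equation. This is only a generic sketch of asymmetric current flow, not a model of the molecular junction itself; the constants `i_s`, `n` and `vt` are illustrative assumptions:

```python
import math

def diode_current(v, i_s=1e-12, n=1.0, vt=0.02585):
    """Shockley ideal-diode equation: I = I_s * (exp(V / (n*Vt)) - 1)."""
    return i_s * (math.exp(v / (n * vt)) - 1.0)

def rectification_ratio(v):
    """Ratio of forward ('on') current to reverse ('off') current at +/- v volts."""
    return diode_current(v) / abs(diode_current(-v))
```

At a bias of only 0.1 V this toy model already rectifies by a factor of roughly fifty, and the ratio grows rapidly with bias; pushing a real single-molecule junction anywhere near such asymmetry is the hard part, which is why a ratio of 250 is notable.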
To overcome the issues associated with asymmetric molecular design, Venkataraman and her colleagues — Chemistry Assistant Professor Luis Campos’ group at Columbia and Jeffrey Neaton’s group at the Molecular Foundry at UC Berkeley — focused on developing an asymmetry in the environment around the molecular junction. They created an environmental asymmetry through a rather simple method: they surrounded the active molecule with an ionic solution and used gold metal electrodes of different sizes to contact the molecule. Their results achieved rectification ratios as high as 250 — 50 times higher than earlier designs. The “on” current flow in their devices can be more than 0.1 microamps, which, Venkataraman notes, is a lot of current to be passing through a single molecule. And, because this new technique is so easily implemented, it can be applied to nanoscale devices of all types, including those that are made with graphene electrodes.
It’s only a centimetre long, it’s placed under your skin, it’s powered by a patch on the surface of your skin and it communicates with your mobile phone. The new biosensor chip developed at EPFL is capable of simultaneously monitoring the concentration of a number of molecules, such as glucose and cholesterol, and certain drugs.
Using a “Gauss gun” principle, an MRI machine drives a “millirobot” through a hypodermic needle into your spinal cord and guides it into your brain to relieve life-threatening fluid buildup.
University of Houston researchers have developed a concept for MRI-powered millimeter-size “millirobots” that could one day perform unprecedented minimally invasive medical treatments. This technology could be used to treat hydrocephalus, for example. Current treatments require drilling through the skull to implant pressure-relieving shunts, said Aaron T. Becker, assistant professor of electrical and computer engineering at the University of Houston. But MRI scanners alone don’t produce enough force to pierce tissues (or insert needles). So the researchers drew upon the principle of the “Gauss gun.”
Here’s how a Gauss gun works: a single steel ball rolls down a chamber, setting off a chain reaction when it smashes into the next ball, and so on, until the last ball flies forward, moving much more quickly than the initial ball. Based on that concept, the researchers imagine a medical robot with a barrel self-assembled from three small high-impact 3D-printed plastic components, with slender titanium rod spacers separating two steel balls.
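The energy bookkeeping behind the Gauss-gun idea can be sketched in a few lines. This toy model simply assumes each stage adds a fixed amount of stored potential energy `delta_e` before a lossless collision hands the motion to the next ball; the names and constants are illustrative, not taken from the paper:

```python
def gauss_gun_exit_speed(n_stages, delta_e, mass, v0=0.0):
    """Exit speed after n_stages, where each stage converts delta_e joules
    of stored (e.g. magnetic) potential energy into kinetic energy and an
    elastic collision passes that motion along to the next ball."""
    v = v0
    for _ in range(n_stages):
        kinetic = 0.5 * mass * v ** 2 + delta_e  # energy gained this stage
        v = (2.0 * kinetic / mass) ** 0.5        # speed of the outgoing ball
    return v
```

Because each stage adds energy rather than speed, the exit speed grows with the square root of the number of stages, which is why chaining even a few balls gives the final one a sharp kick.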
Becker was first author of a paper presented at ICRA, the conference of the IEEE Robotics and Automation Society, nominated for best conference paper and best medical robotics paper. “Hydrocephalus, among other conditions, is a candidate for correction by our millirobots because the ventricles are fluid-filled and connect to the spinal canal,” Becker said. “Our noninvasive approach would eventually require simply a hypodermic needle or lumbar puncture to introduce the components into the spinal canal, and the components could be steered out of the body afterwards.”
Future work will focus on exploring clinical context, miniaturizing the device, and optimizing material selection.
In theory, shape-memory metals ought to be revolutionizing every corner of technology already, from the automotive industry to biotech. These futuristic metals—which can be bent and deformed but pop back to their original shape when heated or jolted with electricity—have already existed for decades. Until now, though, every shape-memory alloy has faced the same glaring issue: they wear out, and fast. Depending on the alloy, the metals will slowly lose their ability to change shape after just a few (or if you're lucky, a few thousand) transformations. That's kept the metals in the lab and out of your car or phone.
Today a team of German and American scientists has stumbled across an alloy of shape-memory metal that just won't quit — not even after being bent and reshaped an astonishing 10 million times, an unparalleled feat.
Manfred Wuttig, a material scientist at the University of Maryland who helped lead the team, said the metal's "fortuitous discovery," was part of a long, frustrating hunt for durable shape-memory metal. As Wuttig and his colleagues detail in a new paper in the journal Science, understanding the secret to this material's hardiness may open the floodgates to a new generation of shape-memory materials that make it into the real world.
"This really is a huge breakthrough, and could make shape-memory alloys much more widely used in everyday technology," says Richard James, a leading shape-memory materials scientist at the University of Minnesota, who was not involved in the research. "I've personally made many, many [shape-memory] alloys that have various super interesting properties, but no one would be able to use them as they last only a few cycles."
Because the new metal keeps its astounding durability, Wuttig and James agree that scientists now have a platform for testing and creating new hyper-durable shape-changing alloys. While Wuttig's new alloy was only created as a thin film measuring several hundred micrometers, "the next step is to scale this up into a bulk alloy. But I see no reason why this would be an issue."
This isn't just a steppingstone to bringing shape-shifting materials into everyday products (finally), James says. "We may even start to see all the various applications we've been dreaming about over the last few decades," like biomedical metallic heart-valves or hyper-efficient solar energy converters.
Portable electronics -- typically made of non-renewable, non-biodegradable and potentially toxic materials -- are discarded at an alarming rate in consumers' pursuit of the next best electronic gadget.
In an effort to alleviate the environmental burden of electronic devices, a team of University of Wisconsin-Madison researchers has collaborated with researchers in the Madison-based U.S. Department of Agriculture Forest Products Laboratory (FPL) to develop a surprising solution: a semiconductor chip made almost entirely of wood.
The research team, led by UW-Madison electrical and computer engineering professor Zhenqiang "Jack" Ma, described the new device in a paper published today (May 26, 2015) by the journal Nature Communications. The paper demonstrates the feasibility of replacing the substrate, or support layer, of a computer chip, with cellulose nanofibril (CNF), a flexible, biodegradable material made from wood.
"The majority of material in a chip is support. We only use less than a couple of micrometers for everything else," Ma says. "Now the chips are so safe you can put them in the forest and fungus will degrade them. They become as safe as fertilizer." Zhiyong Cai, project leader for an engineering composite science research group at FPL, has been developing sustainable nanomaterials since 2009.
"If you take a big tree and cut it down to the individual fiber, the most common product is paper. The dimension of the fiber is in the micron stage," Cai says. "But what if we could break it down further to the nano scale? At that scale you can make this material, very strong and transparent CNF paper."
Working with Shaoqin "Sarah" Gong, a UW-Madison professor of biomedical engineering, Cai's group addressed two key barriers to using wood-derived materials in an electronics setting: surface smoothness and thermal expansion.
"You don't want it to expand or shrink too much. Wood is a naturally hygroscopic material and could attract moisture from the air and expand," Cai says. "With an epoxy coating on the surface of the CNF, we solved both the surface smoothness and the moisture barrier."
Gong and her students also have been studying bio-based polymers for more than a decade. CNF offers many benefits over current chip substrates, she says.
"The advantage of CNF over other polymers is that it's a bio-based material and most other polymers are petroleum-based polymers. Bio-based materials are sustainable, bio-compatible and biodegradable," Gong says. "And, compared to other polymers, CNF actually has a relatively low thermal expansion coefficient."
By unlocking the secrets of a bizarre virus that survives in nearly boiling acid, scientists at the University of Virginia School of Medicine have found a blueprint for battling human disease using DNA clad in near-indestructible armor. "What's interesting and unusual is being able to see how proteins and DNA can be put together in a way that's absolutely stable under the harshest conditions imaginable," said Edward H. Egelman, PhD, of the UVA Department of Biochemistry and Molecular Genetics. "We've discovered what appears to be a basic mechanism of resistance - to heat, to desiccation, to ultraviolet radiation. And knowing that, then, we can go in many different directions, including developing ways to package DNA for gene therapy."
The virus SIRV2 belongs to a common crenarchaeal virus family, the Rudiviridae. It was first discovered in 1998 in the hot acidic sulfurous springs of Iceland. According to previous studies, SIRV2 infects Sulfolobus islandicus, a single-celled microorganism that grows optimally at 80 degrees Celsius and at pH 3. The virus has a very stable rod-shaped viral capsule, about 900 nm long and 23 nm in width.
Now, Dr Prangishvili, Dr Egelman and their colleagues have used cryo-electron microscopy to generate a 3D reconstruction of the SIRV2 virion, which revealed a previously unknown form of virion organization.
The team identified surprising similarities between SIRV2 and the spores bacteria form to survive in inhospitable environments.
“Some of these spores are responsible for very, very horrific diseases that are hard to treat, like anthrax. So we show in this study that this virus actually functions in a similar way to some of the proteins present in bacterial spores,” said Dr Egelman, who is the senior author on the paper published in the journal Science. “Understanding how these bacterial spores work gives us potentially new abilities to destroy them,” he said.
Dr Egelman and co-authors also found that SIRV2 survives the inhospitable conditions by forcing its DNA into what is called A-form, a structural state identified by pioneering DNA researcher Rosalind Franklin more than a half-century ago.
“This is, I think, going to highlight once again the contributions she made, because many people have felt that this A-form of DNA is only found in the laboratory under very non-biological conditions, when DNA is dehydrated or dry. Instead, it appears to be a general mechanism in biology for protecting DNA,” Dr Egelman said.
Every year, an estimated half-million Americans undergo surgery to have a stent prop open a coronary artery narrowed by plaque. But sometimes the mesh tubes get clogged. Scientists report in the journal ACS Nano a new kind of multi-tasking stent that could minimize the risks associated with the procedure. It can sense blood flow and temperature, store and transmit the information for analysis and can be absorbed by the body after it finishes its job.
Doctors have been implanting stents to unblock coronary arteries for 30 years. During that time, the devices have evolved from bare metal, mesh tubes to coated stents that can release drugs to prevent reclogging. But even these are associated with health risks. So researchers have been working on versions that the body can absorb to minimize the risk that a blood clot will form. And now Dae-Hyeong Kim, Seung Hong Choi, Taeghwan Hyeon and colleagues are taking that idea a step further.
The researchers developed and tested in animals a drug-releasing electronic stent that can provide diagnostic feedback by measuring blood flow, which slows when an artery starts narrowing. The device can also heat up on command to speed up drug delivery, and it can dissolve once it's no longer needed.
More information: Bioresorbable Electronic Stent Integrated with Therapeutic Nanoparticles for Endovascular Diseases, ACS Nano, Article ASAP. DOI: 10.1021/acsnano.5b00651
A French company better known for designing aircraft systems announced Wednesday that, on May 29, it will release the world’s first commercially available, scientifically accurate, simulated 3-dimensional (3D) model of a whole, healthy heart. The model may, with fine-tuning and additional development, help to revolutionize the way that cardiologists match treatments to individual heart patients.
The culmination of the first phase of Dassault Systemes' “Living Heart Project,” the simulation may soon allow physicians, medical device manufacturers and others to understand disease states and test innovative treatments without resorting to animal testing.
According to Living Heart Project director Steve Levine, it will soon be possible for cardiologists to rehearse difficult procedures using the company’s 3D modeling. Starting on May 29, when the heart model is released, doctors can use the baseline healthy heart to study congenital defects or heart disease by modifying the shape and tissue properties through the use of a software editor.
Levine says that doctors have developed models and simulations of different sections of the heart, but until now, no one had been able to put these pieces together into a holistic simulation.
“What we can now do for devices that go inside the heart is you can test it on the computer the same way you can test planes,” Levine told Mashable in an interview. The project involves 45 medical professionals, organizations and regulatory agencies, including the Food and Drug Administration (FDA), which oversees the U.S. medical industry. The FDA signed a five-year collaborative research agreement with Dassault to help oversee the development of a heart model that can be used for regulatory science.
Patients with aggressive skin cancer - melanoma - have been treated successfully using a drug based on the herpes virus, in a trial that could pave the way for a new generation of cancer treatments. The findings mark the first positive phase 3 trial results for cancer “virotherapy”, where one disease is harnessed and used to attack another. If approved, the drug, called T-VEC, could be more widely available for cancer patients by next year, scientists predicted.
Crucially, the therapy has the potential to overcome cancer even when the disease has spread to organs throughout the body, offering hope in future to patients who have been faced with the bleakest prognosis. Kevin Harrington, professor of biological cancer therapies at the Institute of Cancer Research London, who led the work, said: “This is the big promise of this treatment. It’s the first time a virotherapy has been shown to be successful in a phase 3 trial.”
In the trial, involving more than 400 patients with aggressive melanoma, one in four patients responded to the treatment, and 16% were still in remission after six months. About 10% of the patients treated had “complete remission”, with no detectable cancer remaining - considered a cure if the patient is still cancer-free five years after diagnosis.
The results are especially encouraging, Harrington said, because all the patients had inoperable, relapsed or metastatic melanoma with no conventional treatment options available to them. “They had disease that ranged from dozens to hundreds of deposits of melanoma on a limb all the way to patients where cancer had spread to the lungs and liver,” he said.
There is a popular misconception about Moore’s law (that the number of transistors on a chip doubles every two years) which has led many to conclude that the 50-year-old prognostication is due to end shortly. This doubling of processing power, for the same cost, has continued apace since Gordon Moore, one of Intel's founders, observed the phenomenon in 1965. At the time, a few hundred transistors could be crammed on a sliver of silicon. Today’s chips can carry billions.
Whether Moore’s law is coming to an end is moot. As far as physical barriers to further shrinkage are concerned, there is no question that, having been made smaller and smaller over the decades, crucial features within transistors are approaching the size of atoms. Indeed, quantum and thermodynamic effects that occur at such microscopic dimensions have loomed large for several years.
Until now, integrated circuits have used a two-dimensional (planar) structure, with a metal gate mounted across a flat, conductive channel of silicon. The gate controls the current flowing from a source electrode at one end of the channel to a drain electrode at the other end. A small voltage applied to the gate lets current flow through the transistor. When there is no voltage on the gate, the transistor is switched off. These two binary states (on and off) are the ones and zeros that define the language of digital devices.
However, when transistors are shrunk beyond a certain point, electrons flowing from the source can tunnel their way through the insulator protecting the gate, instead of flowing direct to the drain. This leakage current wastes power, raises the temperature and, if excessive, can cause the device to fail. Leakage becomes a serious problem when insulating barriers within transistors approach thicknesses of 3 nanometres (nm) or so. Below that, leakage increases exponentially, rendering the device pretty near useless.
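The exponential sensitivity to barrier thickness can be illustrated with a toy direct-tunneling model; `i0` and `decay_nm` below are illustrative constants, not measured device parameters:

```python
import math

def tunneling_leakage(thickness_nm, i0=1.0, decay_nm=0.25):
    """Toy model: direct-tunneling leakage falls off exponentially with
    insulator thickness, so every fraction of a nanometre shaved off the
    barrier multiplies the leakage current."""
    return i0 * math.exp(-thickness_nm / decay_nm)
```

With a 0.25 nm decay length, thinning the barrier from 3 nm to 2 nm multiplies the leakage by about e^4, roughly a factor of 55: the kind of blow-up that makes ultra-thin planar devices "pretty near useless".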
Intel, which sets the pace for the semiconductor industry, started preparing for the leakage problem several “nodes” (changes in feature size) ago. At the time, it was still making 32nm chips. The solution adopted was to turn a transistor’s flat conducting channel into a vertical fence (or fin) that stood proud of the substrate. Instead of just one small contact patch, this gave the gate straddling the fence three contact areas (a large one on either side of the fence and a smaller one across the top). With more control over the current flowing through the channel, leakage is reduced substantially. Intel reckons “Tri-Gate” processors switch 37% faster and use 50% less juice than conventional ones.
Having introduced the Tri-Gate transistor design (now known generically as FinFET) with its 22nm node, Intel is using the same three-dimensional architecture in its current 14nm chips, and expects to do likewise with its 10nm ones, due out later this year and in mainstream production by the middle of 2016. Beyond that, Intel says it has some ideas about how to make 7nm devices, but has yet to reveal details. The company’s road map shows question marks next to future 7nm and 5nm nodes, and peters out shortly thereafter.
At a recent event celebrating the 50th anniversary of Moore’s law, Intel’s 86-year-old chairman emeritus said his law would eventually collapse, but that “good engineering” might keep it afloat for another five to ten years. Mr Moore was presumably referring to further refinements in Tri-Gate architecture. No doubt he was also alluding to advanced fabrication processes, such as “extreme ultra-violet lithography” and “multiple patterning”, which seemingly achieve the impossible by being able to print transistor features smaller than the optical resolution of the printing system itself.
Massive beams of selenite dwarf human explorers in Mexico’s Cave of Crystals, deep below the Chihuahuan Desert. Formed over millennia, these crystals are among the largest yet discovered on Earth. The cave sits at 50˚C with 100% humidity; fewer than a couple of hundred people have been inside, and it is so deadly that even with respirators and suits of ice you can only survive for 20 minutes before your body starts to fail. It’s the nearest thing to visiting another planet – it’s going deep inside our own.
Cueva de los Cristales is the incarnation of our most awesome science fiction imaginations - Jules Verne's Journey to the Centre of the Earth, Superman's Fortress of Solitude. At about the same time as humans first ventured out of Africa, these crystals began to slowly grow. For half a million years they remained protected and nurtured by a womb of hot hydrothermal fluids rich with minerals.
Undisturbed, one can only guess how big they may have eventually grown. Yet when mining began here over a hundred years ago, the water table was lowered and the cave drained. The crystals' seemingly interminable development was frozen forever, leaving them as relics of the deep earth. It wasn't until 2000 that miners, searching for lead, eventually penetrated the cave wall and brought it to light. Who knows what other wonders lie hidden deep inside the earth.
A new technique developed at Stanford University harnesses the buzz of everyday human activity to map the interior of the Earth. "We think we can use it to image the subsurface of the entire continental United States," said Stanford geophysics postdoctoral researcher Nori Nakata.
Using tiny ground tremors generated by the rumble of cars and trucks across highways, the activities within offices and homes, pedestrians crossing the street and even airplanes flying overhead, a team led by Nakata created detailed three-dimensional subsurface maps of the California port city of Long Beach.
The maps, detailed in a recent issue of the Journal of Geophysical Research, mark the first successful demonstration of an elusive Earth-imaging technique, called ambient noise body wave tomography. "It's a technique that scientists have been trying to develop for more than 15 years," said Nakata, who is the Thompson Postdoctoral Fellow at the School of Earth, Energy & Environmental Sciences.
There are two major types of seismic waves: surface waves and body waves. As their name suggests, surface waves travel along the surface of the Earth. Scientists have long been able to harness surface waves to study the upper layers of the planet's crust, and recently they have even been able to extract surface waves from the so-called ambient seismic field. Also known as ambient noise, these are very weak but continuous seismic waves that are generated by colliding ocean waves, among other things.
Body waves, in contrast, travel through the Earth, and as a result can provide much better spatial resolution of the planet's interior than surface waves. "Scientists have been performing body-wave tomography with signals from earthquakes and explosives for decades," said study coauthor Jesse Lawrence, an assistant professor of geophysics at Stanford. "But you can't control when and where an earthquake happens, and explosives are expensive and often damaging."
For this reason, geophysicists have long sought to develop a way to perform body wave tomography without relying on earthquakes or resorting to explosives. This has proven challenging, however, because body waves have lower amplitudes than surface waves, and are therefore harder to observe. "Usually you need to combine and average lots and lots of data to even see them," Lawrence said.
In the new study, the Stanford team applied a new software processing technique, called a body-wave extraction filter. Nakata developed the filter to analyze ambient noise data gathered from a network of thousands of sensors that had been installed across Long Beach to monitor existing oil reservoirs beneath the city. Using this filter, the team was able to create maps that revealed details about the subsurface of Long Beach down to a depth of more than half a mile (1.1 kilometers). The body-wave maps were comparable to, and in some cases better than, existing imaging techniques.
One map, for example, clearly revealed the Newport-Inglewood fault, an active geological fault that cuts through Long Beach. This fault also shows up in surface-wave maps, but the spatial resolution of the body-wave velocity map was much higher, and revealed new information about the velocity of seismic waves traveling through the fault's surrounding rocks, which in turn provides valuable clues about their composition and organization.
The deployment will become the longest floating structure in world history.
Boyan Slat, 20-year-old founder and CEO of The Ocean Cleanup, today announced that the world’s first system to passively clean up plastic pollution from the world’s oceans is to be deployed in 2016. He made the announcement at Asia’s largest technology conference, the Seoul Digital Forum, in South Korea.
The array is projected to be deployed in Q2 2016. The feasibility of deploying it off the coast of Tsushima, an island located in the waters between Japan and South Korea, is currently being researched.
The system will span 2000 meters, thereby becoming the longest floating structure ever deployed in the ocean (beating the current record of 1000 m held by the Tokyo Mega-Float). It will be operational for at least two years, catching plastic pollution before it reaches the shores of the proposed deployment location of Tsushima island. Tsushima island is evaluating whether the plastic can be used as an alternative energy source.
The scale of the plastic pollution problem (in the case of Tsushima island, approximately one cubic meter of pollution per person washes up each year) has led the local Japanese government to seek innovative solutions.
The deployment will represent an important milestone in The Ocean Cleanup’s mission to remove plastic pollution from the world’s oceans. Within five years, after a series of deployments of increasing scale, The Ocean Cleanup plans to deploy a 100km-long system to clean up about half the Great Pacific Garbage Patch, between Hawaii and California.
Boyan Slat, founder and CEO of The Ocean Cleanup: “Taking care of the world’s ocean garbage problem is one of the largest environmental challenges mankind faces today. Not only will this first cleanup array contribute to cleaner waters and coasts but it simultaneously is an essential step towards our goal of cleaning up the Great Pacific Garbage Patch. This deployment will enable us to study the system’s efficiency and durability over time."
Engineering switchable reconfigurations in DNA-controlled nanoparticle arrays could lead to dynamic energy-harvesting or responsive optical materials
The rapid development of self-assembly approaches has enabled the creation of materials with desired organization of nanoscale components. However, achieving dynamic control, wherein the system can be transformed on demand into multiple entirely different states, is typically absent in atomic and molecular systems and has remained elusive in designed nanoparticle systems. Here, we demonstrate with in situ small-angle X-ray scattering that, by using DNA strands as inputs, the structure of a three-dimensional lattice of DNA-coated nanoparticles can be switched from an initial ‘mother’ phase into one of multiple ‘daughter’ phases. The introduction of different types of reprogramming DNA strands modifies the DNA shells of the nanoparticles within the superlattice, thereby shifting interparticle interactions to drive the transformation into a particular daughter phase. Moreover, we mapped quantitatively with free-energy calculations the selective reprogramming of interactions onto the observed daughter phases.
Scientists at the U.S. Department of Energy’s Brookhaven National Laboratory have developed the capability of creating dynamic nanomaterials — ones whose structure and associated properties can be switched, on-demand. In a paper appearing in Nature Materials, they describe a way to selectively rearrange nanoparticles in three-dimensional arrays to produce different configurations, or “phases,” from the same nano-components.
“One of the goals in nanoparticle self-assembly has been to create structures by design,” said Oleg Gang, who led the work at Brookhaven’s Center for Functional Nanomaterials (CFN), a DOE Office of Science User Facility. “Until now, most of the structures we’ve built have been static.” KurzweilAI covered that development in a previous article, “Creating complex structures using DNA origami and nanoparticles.”
The new advance in nanoscale engineering builds on that previous work in developing ways to get nanoparticles to self-assemble into complex composite arrays, including linking them together with tethers constructed of complementary strands of synthetic DNA.
“We know that properties of materials built from nanoparticles are strongly dependent on their arrangements,” said Gang. “Previously, we’ve even been able to manipulate optical properties by shortening or lengthening the DNA tethers. But that approach does not permit us to achieve a global reorganization of the entire structure once it’s already built.”
Engineers at The Ohio State University have created a circuit that makes cell phone batteries last up to 30 percent longer on a single charge. The trick: it converts some of the radio signals emanating from a phone into direct current (DC) power, which then charges the phone’s battery.
This new technology can be built into a cell phone case, adding minimal bulk and weight.
“When we communicate with a cell tower or Wi-Fi router, so much energy goes to waste,” explained Chi-Chih Chen, research associate professor of electrical and computer engineering. “We recycle some of that wasted energy back into the battery.”
“Our technology is based on harvesting energy directly from the source,” explained Robert Lee, professor of electrical and computer engineering. By Lee’s reckoning, nearly 97 percent of cell phone signals never reach a destination and are simply lost. Some of that energy can be captured.
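A rough back-of-the-envelope estimate shows why even a modest harvesting efficiency matters when almost the entire transmitted signal is wasted. Every number below except the 97 percent waste figure is an illustrative assumption, not a specification of the Ohio State circuit:

```python
# Illustrative estimate of recoverable RF energy from a transmitting
# phone. Only the 97% waste figure comes from the article; the rest
# are assumptions for the sake of the sketch.

tx_power_w = 0.5           # assumed average transmit power while active
wasted_fraction = 0.97     # article: ~97% of the signal never reaches a destination
harvest_efficiency = 0.10  # assumed fraction of wasted power the circuit recovers

recovered_w = tx_power_w * wasted_fraction * harvest_efficiency
print(f"recovered power: {recovered_w * 1000:.1f} mW")

# Expressed as charge current flowing back into a nominal 3.7 V battery:
recovered_ma = recovered_w / 3.7 * 1000
print(f"equivalent charge current: {recovered_ma:.1f} mA")
```

With these assumptions the circuit would return tens of milliwatts to the battery during transmission, which is consistent in spirit with the claimed 30 percent gain in battery life.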
The idea is to siphon off just enough of the radio signal to noticeably slow battery drain, but not enough to degrade voice quality or data transmission.
Cell phones broadcast in all directions at once to reach the nearest cell tower or Wi-Fi router. Chen and his colleagues came up with a system that identifies which radio signals are being wasted. It works only when a phone is transmitting.
Next, the engineers want to insert the device into a “skin” that sticks directly to a phone, or better, partner with a manufacturer to build it directly into a phone, tablet or other portable electronic device.
In a paper published today in the journal Science, researchers at MIT reveal that they were able to reactivate memories that could not otherwise be retrieved, using a technology known as optogenetics.
The finding answers a fiercely debated question in neuroscience as to the nature of amnesia, according to Susumu Tonegawa, the Picower Professor in MIT's Department of Biology and director of the RIKEN-MIT Center at the Picower Institute for Learning and Memory, who directed the research by lead authors Tomas Ryan, Dheeraj Roy, and Michelle Pignatelli.
Neuroscience researchers have for many years debated whether retrograde amnesia -- which follows traumatic injury, stress, or diseases such as Alzheimer's -- is caused by damage to specific brain cells, meaning a memory cannot be stored, or if access to that memory is somehow blocked, preventing its recall. "The majority of researchers have favored the storage theory, but we have shown in this paper that this majority theory is probably wrong," Tonegawa says. "Amnesia is a problem of retrieval impairment."
Memory researchers have previously speculated that somewhere in the brain network is a population of neurons that are activated during the process of acquiring a memory, causing enduring physical or chemical changes. If these groups of neurons are subsequently reactivated by a trigger such as a particular sight or smell, for example, the entire memory is recalled. These neurons are known as "memory engram cells."
In 2012 Tonegawa's group used optogenetics -- in which proteins are added to neurons to allow them to be activated with light -- to demonstrate for the first time that such a population of neurons does indeed exist in an area of the brain called the hippocampus. However, until now no one has been able to show that these groups of neurons do undergo enduring chemical changes, in a process known as memory consolidation. One such change, known as "long-term potentiation" (LTP), involves the strengthening of synapses, the structures that allow groups of neurons to send signals to each other, as a result of learning and experience.
To find out if these chemical changes do indeed take place, the researchers first identified a group of engram cells in the hippocampus that, when activated using optogenetic tools, were able to express a memory. When they then recorded the activity of this particular group of cells, they found that the synapses connecting them had been strengthened. "We were able to demonstrate for the first time that these specific cells -- a small group of cells in the hippocampus -- had undergone this augmentation of synaptic strength," Tonegawa says.
The researchers then attempted to discover what happens to memories without this consolidation process. By administering a compound called anisomycin, which blocks protein synthesis within neurons, immediately after mice had formed a new memory, the researchers were able to prevent the synapses from strengthening. When they returned one day later and attempted to reactivate the memory using an emotional trigger, they could find no trace of it. "So even though the engram cells are there, without protein synthesis those cell synapses are not strengthened, and the memory is lost," Tonegawa says.
But startlingly, when the researchers then reactivated the protein synthesis-blocked engram cells using optogenetic tools, they found that the mice exhibited all the signs of recalling the memory in full.
Bees around the world are at risk from a number of threats, including habitat loss, the effects of pesticides, and bacterial diseases like American foulbrood. Bee colonies are also at risk from mites and parasites, especially the parasitic Varroa mite. Although parasites have long been associated with “colony collapse disorder”, where entire hives are wiped out, it is only recently that the magnitude of the threat has been fully realised.
The parasite concerned is a microsporidian called Nosema ceranae, which can harm adult bees and their larvae. It causes adult bees to die early, and kills the larvae before they can transform into bees. It is spread easily via airborne spores.
The parasite poses a particular threat to honeybees found in Europe and across Asia. What is new, relative to earlier investigations, is the risk to larvae: most previous research had detected infections only in adult bees.
The enhanced risks were found from studies conducted in a laboratory, where bees were kept and various risk scenarios involving the spread of the parasite were tried out. Under certain conditions, the scientists showed, entire colonies can be wiped out through parasitic infection.
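A toy simulation illustrates how a spread scenario of this kind can tip a colony into collapse. This minimal discrete-time sketch is purely illustrative; the populations, rates, and collapse threshold are invented, not taken from the UC San Diego study:

```python
def simulate_colony(days, transmission_rate, mortality_rate):
    """Toy discrete-time model of Nosema ceranae spread in a hive.

    Purely illustrative: the starting populations, rates, and
    collapse threshold are invented for this sketch."""
    healthy, infected = 10_000, 10
    for _ in range(days):
        # airborne spores from infected bees create new infections
        new_cases = min(healthy, int(transmission_rate * infected))
        healthy -= new_cases
        infected += new_cases
        # infected adults die early
        infected -= int(mortality_rate * infected)
        if healthy + infected < 500:  # colony below viable size
            return True   # collapsed
    return False          # survived the period

# A high-transmission scenario wipes out the colony within a year;
# a low-transmission scenario does not.
print(simulate_colony(365, transmission_rate=1.5, mortality_rate=0.2))
print(simulate_colony(365, transmission_rate=0.1, mortality_rate=0.2))
```

The point of such models is the threshold behavior: when each infected bee infects more than one replacement on average, infection growth outruns the colony, and the same parasite that merely lingers at low transmission becomes lethal at high transmission.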
Researchers have also found that infection is not easy to treat. Adult bees can be sprayed with the chemical fumagillin; however, when the effects wear off the infection can re-emerge.
Bees are of great ecological importance (many agricultural crops worldwide are pollinated by honeybees), so understanding why bees are in decline worldwide matters. The research into the parasitic risks is continuing.
The research was carried out at UC San Diego. The findings have been published in the journal PLOS One, in a paper titled “Nosema ceranae Can Infect Honey Bee Larvae and Reduces Subsequent Adult Longevity.”
Dinosaurs flourished in Europe right up until the asteroid impact that wiped them out 66 million years ago, a new study shows. The theory that an asteroid rapidly killed off the dinosaurs is widely recognized, but until recently dinosaur fossils from the latest Cretaceous--the final stanza of dinosaur evolution--were known almost exclusively from North America. This has raised questions about whether the sudden decline of dinosaurs in the American and Canadian west was merely a local story.
The new study synthesizes a flurry of research on European dinosaurs over the past two decades. Fossils of latest Cretaceous dinosaurs are now commonly discovered in Spain, France, Romania, and other countries. By looking at the variety and ages of these fossils, a team of researchers led by Zoltán Csiki-Sava of the University of Bucharest's Faculty of Geology and Geophysics has determined that dinosaurs remained diverse in European ecosystems very late into the Cretaceous.
In the Pyrenees of Spain and France, the best area in Europe for finding latest Cretaceous dinosaurs, meat- and plant-eating species were present and seemingly flourishing during the final few hundred thousand years before the asteroid hit.
Dr Csiki-Sava said "For a long time, Europe was overshadowed by other continents when the understanding of the nature, composition and evolution of latest Cretaceous continental ecosystems was concerned. The last 25 years witnessed a huge effort across all Europe to improve our knowledge, and now we are on the brink of fathoming the significance of these new discoveries, and of the strange and new story they tell about life at the end of the Dinosaur Era."
Dr Steve Brusatte of the University of Edinburgh's School of GeoSciences (UK), an author on the report, added: "Everyone knows that an asteroid hit 66 million years ago and dinosaurs disappeared, but this story is mostly based on fossils from one part of the world, North America. We now know that European dinosaurs were thriving up to the asteroid impact, just like in North America. This is strong evidence that the asteroid really did kill off dinosaurs in their prime, all over the world at once."
Plankton are vital to life on Earth — they absorb carbon dioxide, generate nearly half of the oxygen we breathe, break down waste, and are a cornerstone of the marine food chain. Now, new research indicates the diminutive creatures are not only more diverse than previously thought, but also profoundly affected by their environment.
Tara Oceans, an international consortium of researchers from MIT and elsewhere that has been exploring the world’s oceans in hopes of learning more about one of its smallest inhabitants, reported their initial findings this week in a special issue of Science. From 2009 to 2012, a small crew sailed on a 110-foot schooner collecting 35,000 samples of marine microbes and viruses from 200 locations around the globe — facing pirates, high winds, and ice storms in the process. But the effort was worth it. Among the studies’ findings: millions of new genes, thousands of new viruses, insights into microbial interactions, and ocean temperature's impact on species diversity.
The researchers identified 40 million genes in the upper ocean, most of which are new to science. In comparison, the human gut microbiome only has 10 million genes. Additionally, researchers identified more than 5,000 viruses, only 39 of which were known previously.
Underneath the ocean surface, viruses, plankton, and other microbes battle one another for survival. These interactions — which are mainly parasitic in nature — are vital for maintaining diversity, as they prevent one species from dominating the environment, the study's authors found. The expedition also revealed that species diversity is shaped by ocean temperature, which is on the rise. The new wealth of data should allow researchers to build predictive models of how microbial communities will change in a warming world, and of the resulting impacts on oxygen production, carbon dioxide absorption, and ecosystem dynamics.
“The finding that temperature shapes which species are present, for instance, is especially relevant in the context of climate change, but to some extent this is just the beginning,” says Chris Bowler, a plant biologist from the French National Centre for Scientific Research. “The resources we’ve generated will allow us and others to delve even deeper, and finally begin to really understand the workings of this invisible world.”
Back in 2013, we heard that nanoengineers at the University of California, San Diego (UC San Diego) had successfully used nanosponges to soak up toxins in the bloodstream. Fast-forward two years and the team is back with more nanospongey goodness, now using hydrogel to keep the tiny fellas in place, allowing them to tackle infections such as MRSA, without the need for antibiotics.
Let's start with a quick recap. In 2013, a team of researchers announced that they'd successfully managed to create nanosponges – nanoparticles coated in red blood cell membranes – that flow through the bloodstream, removing harmful toxins as they go. The red blood cell coating tricks the immune system into ignoring the nanoparticles, but the disguise also attracts pore-forming toxins that kill cells by perforating their outer membranes.
This breakthrough was ideal if you wanted to deal with harmful toxins in the bloodstream, such as snake venom, but it didn't allow for a sustained attack in a localized region. Since the initial announcement, the team has been working on improving the method, with the new study focusing on adapting it to clear up antibiotic-resistant bacterial infections.
In order to keep the nanosponges tied to a specific area, the team turned to hydrogel – a gel made of water and polymers. The team mixed the nanosponges into the hydrogel, which then holds them in place at an infected spot, allowing for all of the toxins to be removed.
Nanosponges are some three thousand times smaller than red blood cells, allowing billions to be held in every milliliter of hydrogel. The gel's pores are small enough to keep the nanosponges in, but also large enough to allow the toxins to pass through, making it an ideal agent for delivery of the treatment.
As the method doesn't involve antibiotics, it's thought that it won't be affected by existing bacterial antibiotic resistance, and the bacteria shouldn't develop any new resistance in response to the treatment.
The nanosponge/hydrogel combination was tested on MRSA-infected mice, with the team observing significantly smaller lesions on treated as opposed to untreated subjects. The tests also confirmed that hydrogel was effective at holding the nanosponges in place, with 80 percent remaining at the site of infection two days after being injected.
The UC San Diego researchers posted the results of their study in the journal Advanced Materials.
It's smaller than your index finger, and it might be the future of implantable devices to treat a fractured spine, pinched nerve, or neurological disorder like epilepsy.
As they report in the journal Science, a team of engineers and medical researchers in Sweden has just designed a pinpoint-accurate implantable drug pump. It delivers medicine with such precision that it requires only 1 percent of the drugs doctors would otherwise need to deploy. As it demonstrated in tests on seven rats, the tiny pump can attach directly to the spine (at the root of a nerve) and inject its medicine molecule by molecule.
"In theory, we could tell you exactly how many molecules our device is delivering," says Amanda Jonsson, the bioelectronics engineer at Sweden's Linköping University who led the team. "These very small dosages could help avoid drug side effects, or be useful for medicines that we simply can't use at larger doses."
The technology is based on a compact but complicated piece of laboratory equipment called an ion pump. To put it simply, as electric current enters the ion pump one electron at a time, medicine is flung out the other end one molecule at a time. One caveat: Because of this setup, only medicines that can be electrically charged can be used with the pump. But that includes more pain medicines than you might think, including morphine and other opiates.
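Because each drug ion moved carries a fixed electric charge, the delivered dose can in principle be counted from the transported charge: N = Q / (z·e). A hedged sketch of that arithmetic follows; the current and duration are illustrative assumptions, not specifications of the Linköping device:

```python
# Dose counting for an electrophoretic ion pump: each drug ion moved
# carries z elementary charges, so molecules delivered = Q / (z * e).
# The current and duration below are assumptions for illustration.

E_CHARGE = 1.602176634e-19  # elementary charge, in coulombs

def molecules_delivered(current_a, seconds, charge_number=1):
    """Ions transported by a steady current over a given time."""
    total_charge = current_a * seconds  # Q = I * t
    return total_charge / (charge_number * E_CHARGE)

# Example: 1 nA sustained for 10 s moves roughly 6e10 singly
# charged drug ions -- a vanishingly small chemical dose.
print(f"{molecules_delivered(1e-9, 10):.2e} molecules")
```

Even sixty billion molecules is on the order of a tenth of a picomole, which is why this delivery scheme can work with roughly 1 percent of a conventional dose.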
We nag our kids to brush their teeth well, but a few hours later, their mouths are just as full of bacteria as before they brushed.
Microbiologist Wenyuan Shi of UCLA thinks a sweet sucker might help lick the problem. Shi laments that while the cause of tooth decay is known to be an infection, dentistry today still uses a “mechanical” approach to disease. He says that there are 100 trillion bacteria in your mouth, consisting of 700 different species, but only 12 of those species cause any harm. One in particular, Streptococcus mutans, is a major factor in tooth decay.
“What we really try to do is to detect the pathogen who is responsible for the tooth decay, and treating the pathogen or get rid of the pathogen way before they are damaging the tooth,” says Shi. The challenge of that approach is that some of those bugs are actually beneficial. So Shi is working on ways to target the harmful bacteria while leaving the beneficial ones alone. “It’s like a dandelion infection in your lawn,” he says, “and if you use a general herbicide, you do kill the dandelion, but you kill the grass as well; and the moment you stop using your herbicide, who comes back first? It’s always the weeds.”
Shi looked to his Chinese roots for a traditional herbal remedy that targets only the bad bacteria. “We did a lot of the screening, and to our great surprise, one of the top hit we got out of the 2,000 medicinal herbs is licorice. And, as you know, many cultures have been chewing the licorice roots as a way to actually promoting oral health,” he says.
As they reported in the Journal of Natural Products, Shi’s team isolated the active compounds in licorice and showed they kill decay-causing bacteria in lab tests. With corporate partner C3-Jian, Inc., they developed an extract that would specifically combat S. mutans. To get the compounds into extended contact with teeth, they put them in a lollipop, manufactured and sold by Dr. John’s Candies, which specializes in sugar-free candy. The lollipops are orange flavored.
You can’t get the same effect from just eating licorice. Most licorice sold in the U.S. is actually flavored with anise. Plus it contains lots of sugar, which is bad for your teeth. Real licorice falls under the “generally recognized as safe” category by the FDA so the lollipops are already on the market, and starting to show up in dentists’ offices and pharmacies.
At this year’s Consumer Electronics Show in Las Vegas, the big theme was the “Internet of things” — the idea that everything in the human environment, from kitchen appliances to industrial equipment, could be equipped with sensors and processors that can exchange data, helping with maintenance and the coordination of tasks.
Realizing that vision, however, requires transmitters that are powerful enough to broadcast to devices dozens of yards away but energy-efficient enough to last for months — or even to harvest energy from heat or mechanical vibrations.
“A key challenge is designing these circuits with extremely low standby power, because most of these devices are just sitting idling, waiting for some event to trigger a communication,” explains Anantha Chandrakasan, the Joseph F. and Nancy P. Keithley Professor in Electrical Engineering at MIT. “When it’s on, you want to be as efficient as possible, and when it’s off, you want to really cut off the off-state power, the leakage power.”
This week, at the Institute of Electrical and Electronics Engineers’ International Solid-State Circuits Conference, Chandrakasan’s group will present a new transmitter design that reduces off-state leakage 100-fold. At the same time, it provides adequate power for Bluetooth transmission, or for the even longer-range 802.15.4 wireless-communication protocol.
“The trick is that we borrow techniques that we use to reduce the leakage power in digital circuits,” Chandrakasan explains. The basic element of a digital circuit is a transistor, in which two electrical leads are connected by a semiconducting material, such as silicon. In their native states, semiconductors are not particularly good conductors. But in a transistor, the semiconductor has a second wire sitting on top of it, which runs perpendicularly to the electrical leads. Sending a positive charge through this wire — known as the gate — draws electrons toward it. The concentration of electrons creates a bridge that current can cross between the leads. Even when the gate is off, however, some current still leaks between the leads; applying a negative charge to the gate repels electrons away from the channel and chokes off that leakage.
To generate the negative charge efficiently, the MIT researchers use a circuit known as a charge pump, which is a small network of capacitors — electronic components that can store charge — and switches. When the charge pump is exposed to the voltage that drives the chip, charge builds up in one of the capacitors. Throwing one of the switches connects the positive end of the capacitor to the ground, causing a current to flow out the other end. This process is repeated over and over. The only real power drain comes from throwing the switch, which happens about 15 times a second.
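The power cost of that switching can be estimated with the standard relation for charging a capacitor, P = ½·C·V²·f. In the sketch below, only the 15 Hz switch rate comes from the article; the capacitance and voltage are illustrative guesses:

```python
# Standby drain of the charge pump, estimated as the energy needed to
# recharge the pump capacitor times the switching rate:
#   P = 0.5 * C * V^2 * f
# Capacitance and voltage are illustrative guesses; the ~15 Hz switch
# rate comes from the article.

def switching_power(capacitance_f, voltage_v, switch_rate_hz):
    """Average power spent recharging the capacitor at each switch."""
    energy_per_switch = 0.5 * capacitance_f * voltage_v ** 2
    return energy_per_switch * switch_rate_hz

p = switching_power(capacitance_f=10e-12, voltage_v=1.0, switch_rate_hz=15)
print(f"standby switching power: {p * 1e12:.0f} pW")  # 75 pW for these values
```

Even with generous assumptions, a capacitor switched only 15 times per second dissipates on the order of picowatts, which is why the charge pump is such a cheap way to hold the transmitter's transistors firmly off.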
To make the transmitter more efficient when it’s active, the researchers adopted techniques that have long been a feature of work in Chandrakasan’s group. Ordinarily, the frequency at which a transmitter can broadcast is a function of its voltage. But the MIT researchers decomposed the problem of generating an electromagnetic signal into discrete steps, only some of which require higher voltages. For those steps, the circuit uses capacitors and inductors to increase voltage locally. That keeps the overall voltage of the circuit down, while still enabling high-frequency transmissions.
What those efficiencies mean for battery life depends on how frequently the transmitter is operational. But if it can get away with broadcasting only every hour or so, the researchers’ circuit can reduce power consumption 100-fold.
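The battery-life claim follows from simple duty-cycle arithmetic: when the radio is almost always idle, average power is dominated by leakage, so cutting leakage cuts average power almost proportionally. A sketch with assumed power figures (the 100-fold leakage reduction is the article's; everything else is illustrative):

```python
# Average power of a duty-cycled transmitter:
#   P_avg = d * P_active + (1 - d) * P_leak
# With a tiny duty cycle d, leakage dominates, so reducing P_leak
# 100-fold reduces P_avg by nearly as much. The power values below
# are illustrative assumptions, not measurements of the MIT chip.

def average_power(p_active_w, p_leak_w, duty_cycle):
    return duty_cycle * p_active_w + (1 - duty_cycle) * p_leak_w

duty = 0.01 / 3600.0   # assume 10 ms of transmission per hour
p_active = 1e-3        # assume 1 mW while transmitting

old = average_power(p_active, p_leak_w=1e-6, duty_cycle=duty)  # 1 uW leakage
new = average_power(p_active, p_leak_w=1e-8, duty_cycle=duty)  # 100x less leakage
print(f"average-power improvement: {old / new:.0f}x")
```

With these numbers the improvement is roughly 78-fold, and it approaches the full 100-fold figure as the transmitter's duty cycle shrinks further, matching the article's "broadcasting only every hour or so" caveat.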