Watching another episode on Netflix, reading the Guardian online and downloading apps are not obvious ways to pollute the atmosphere. But collectively, our growing appetite for digital services means the data centers that power them are now responsible for about 2% of global greenhouse gas emissions, a similar share to aviation.
Varying from a small room with servers to vast farms with a floor area of 150,000 square meters, data centers are big energy users. As well as requiring power to run the equipment that stores and serves us cloud computing and on-demand music, films and entertainment, those servers also generate a lot of heat and require huge amounts of energy to keep them cool. That’s why big data users such as Facebook are siting their centers in cool climates such as northern Sweden.
Individually, our everyday browsing has a relatively minuscule impact. Google, in response to claims that each search on its site generated as much CO2 as boiling half the water for a cup of coffee (7g), calculated the true figure was much lower, at 0.2g. Watching a YouTube video of cats was higher – 1g for every 10 minutes of viewing – while using Gmail for a year produced about 1.2kg a user.
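The per-activity figures above can be combined into a rough annual footprint for a single user. The usage levels below (searches and viewing time per day) are illustrative assumptions, not figures from the article:

```python
# Rough annual CO2 estimate for one user's browsing, using the
# per-activity figures quoted above (grams of CO2).
SEARCH_G = 0.2           # per Google search
YOUTUBE_G_PER_10MIN = 1  # per 10 minutes of video
GMAIL_G_PER_YEAR = 1200  # per user per year

# Hypothetical usage levels for illustration.
searches_per_day = 10
video_min_per_day = 60

annual_g = (
    searches_per_day * 365 * SEARCH_G
    + (video_min_per_day / 10) * 365 * YOUTUBE_G_PER_10MIN
    + GMAIL_G_PER_YEAR
)
print(f"{annual_g / 1000:.1f} kg CO2/year")  # ~4.1 kg for this usage
```

Even at this level of use, an individual's total is a few kilograms a year; the aggregate problem comes from billions of users and the growth rate described below.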
As with emissions from air travel, the real issue with data centers is the rate at which they are growing. “It’s an exponential growth in data,” said Sophia Flucker, director at Operational Intelligence, a UK-based consultancy that advises data centers on their energy use. “Although IT efficiency is improving, and we can do more with less power, the demand is still there,” she said. “Any efficiency gains are being eaten up by demand. It’s very much an upward trajectory.”
A study led by researchers at the University of Washington and the National Oceanic and Atmospheric Administration connects the unprecedented West Coast toxic algal bloom of 2015, which closed fisheries from Southern California northward, to the unusually warm ocean conditions of that period.
"We have toxic algae events that result in shellfish closures off the Washington and Oregon coast every three to five years or so, but none of them have been as large as this one," said lead author Ryan McCabe, a research scientist at the UW's Joint Institute for the Study of the Atmosphere and Ocean, a collaborative center with NOAA. "This one was entirely different, and our results show that it was connected to the unusual ocean conditions."
The study is now online in Geophysical Research Letters, a journal of the American Geophysical Union. "This paper is significant because it identifies a link between ocean conditions and the magnitude of the toxic bloom in 2015 that resulted in the highest levels of domoic acid contamination in the food web ever recorded for many species," said co-author Kathi Lefebvre, a marine biologist at NOAA's Northwest Fisheries Science Center. "This is an eye-opener for what the future may hold as ocean conditions continue to warm globally."
The authors found that the 2015 harmful algal bloom, which set records for its spatial extent and the level of toxicity, was dominated by a single species of diatom, Pseudo-nitzschia australis, normally found farther south off California.
Warm water not only allowed this species to survive, it also created an environment favoring its growth. By early 2015 the warm "blob" had moved toward shore and spread all along the West Coast. Warmer water creates less dense surface water that is more likely to stay floating on the surface, where it can become depleted in nutrients.
Previous laboratory studies by co-author William Cochlan of San Francisco State University showed that P. australis can take up nitrogen very quickly from a variety of sources, and appears to outcompete other, nontoxic phytoplankton in nutrient-depleted warm water.
Light, when strongly concentrated, is enormously powerful. Now, a team of physicists led by Professor Jörg Schreiber from the Institute of Experimental Physics – Medical Physics, which is part of the Munich-Centre for Advanced Photonics (MAP), a Cluster of Excellence at LMU Munich, has used this energy source with explosive effect. The researchers focus high-power laser light onto beads of plastic just a few micrometers in size. The concentrated energy blows the beads apart, releasing radiation made up of positively charged particles (protons). Such proton beams could be used in future for treating tumors, and in advanced imaging techniques. Their findings appear in the journal Physical Review E.
A team of researchers from Germany and Canada has found a way to make graphene superconductive—by doping it with lithium atoms.
By now, almost everyone in the science community is aware of graphene, the single-atom-thick layer of carbon that is being studied to figure out how it can be mass produced and connected to other devices to take advantage of its superior electrical properties. Some have also been looking into whether the material could be made into a superconductor – prior research a decade ago showed that graphite could be made superconductive by coating it with other materials. Since that time, the search has been on to find just the right coating for graphene. Three years ago, a group in Italy created a model that suggested lithium might be the right choice. Now, based on the work done by this latest team, it appears that they might have been right.
In this effort, the researchers first grew samples of graphene on a silicon-carbide substrate. Those samples were then placed in a vacuum, cooled to 8 K and "decorated" very precisely with a layer of lithium atoms. To convince themselves that the result was superconductive, the team tested the material with angle-resolved photoemission spectroscopy, which revealed that electrons sent through the material slowed down – the result, they suggest, of electron-phonon coupling (the creation of Cooper pairs), one of the hallmarks of a superconductor. The team also identified an energy gap between those electrons that were conducting and those that were not, energy that would be needed to break the electron-phonon coupling.
Further tests will have to be done to discover if it is possible to demonstrate a complete loss of electrical resistance and the expulsion of an external magnetic field, tests that can prove the material to be a true superconductor.
Rice University scientists have invented a technology that could potentially identify hundreds of bacterial pathogens simply, quickly and at low cost using a single set of random DNA probes. Rice’s “universal microbial diagnostic,” or UMD, uses pieces of randomly assembled DNA and mathematical techniques that were originally pioneered for signal processors inside digital phones and cameras.
In a paper online this week in Science Advances, Rice’s research team used lab tests to verify that UMD could identify 11 known strains of bacteria using the same five random DNA probes. Because the probes are not specific to a particular disease, the technology provides a genomic-based bacterial identification system that does not require a one-to-one ratio of DNA probes to pathogenic species.
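The reason a handful of random probes can distinguish many species is that each species produces a characteristic pattern of hybridization strengths across all probes at once, and an unknown sample can be matched to the closest reference pattern. Below is a toy NumPy sketch of that idea – a nearest-signature match with made-up numbers, not the actual UMD algorithm or data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model: 5 random probes vs. 11 species. Each species yields a
# characteristic vector of hybridization strengths across the probes.
n_probes, n_species = 5, 11
signatures = rng.random((n_probes, n_species))  # reference "fingerprints"

# Simulate measuring an unknown sample (species 7) with a little noise.
true_species = 7
measurement = signatures[:, true_species] + rng.normal(0, 0.01, n_probes)

# Identify by nearest reference signature (smallest residual).
residuals = np.linalg.norm(signatures - measurement[:, None], axis=0)
identified = int(np.argmin(residuals))
print(identified)  # recovers species 7
```

Because the 5-probe fingerprints of random species differ far more than the measurement noise, 5 probes suffice here for 11 species – the same many-from-few principle the Rice team borrows from signal processing.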
Hydrogen is often considered a fuel for the future, in the form of fuel cells to power electric motors or burned in internal combustion engines. But finding a practical, inexpensive and nontoxic way to produce large amounts of hydrogen gas – especially by splitting water into its component parts, hydrogen and oxygen – has been a challenge.
A team of researchers from the University of Houston and the California Institute of Technology has reported a more efficient catalyst, using molybdenum sulfoselenide particles on three-dimensional porous nickel diselenide foam to increase catalytic activity. The foam, made using commercially available nickel foam, significantly improved catalytic performance because it exposed more edge sites, where catalytic activity is higher than it is on flat surfaces, said Zhifeng Ren, MD Anderson Professor of physics at UH.
Ren is lead author of a paper in Nature Communications describing the discovery. Other researchers involved include Haiqing Zhou, Fang Yu, Jingying Sun, Ran He, Shuo Chen, Jiming Bao and Zhuan Zhu, all of UH, and Yufeng Huang, Robert J. Nielsen and William A. Goddard III of the California Institute of Technology. "With the massive consumption of fossil fuels and its detrimental impact on the environment, methods of generating clean power are urgent," the researchers wrote. "Hydrogen is an ideal carrier for renewable energy; however, hydrogen generation is inefficient because of the lack of robust catalysts that are substantially cheaper than platinum."
Platinum catalysts have the highest efficiency rate for hydrogen evolution, said Ren, who also is a principal investigator at the Texas Center for Superconductivity. But platinum is rare, difficult to extract and too expensive for practical use, he said, and researchers continue to seek less expensive ways to split water into its component parts. Currently, most hydrogen is produced through steam methane reforming and coal gasification; those methods raise the fuel's carbon footprint despite the fact that it burns cleanly.
Molybdenum sulfoselenide and similar layered compounds have shown promise as catalysts, but so far no one has boosted their performance to viable levels in bulk form. The researchers say most active catalysis on those layered compounds, known as layered transition-metal dichalcogenides, or LTMDs, takes place at the edges, making the idea of a substrate with a large number of exposed edges more desirable. Also, they wrote, "arranging two different materials into hybrids might lead to synergistic effects that utilize the best properties of each component."
Their hybrid catalyst is composed of molybdenum sulfoselenide particles with vertically aligned layers on a 3-D porous conductive nickel diselenide scaffold. Testing determined that the hybrid catalyst required 69 millivolts from an external energy source to achieve a current density of 10 milliamps per square centimeter, which the researchers said is much better than many previously reported tests. In this case, the current "splits" the water, converting it to hydrogen at the cathode. Achieving the necessary current density with lower voltage improves energy conversion efficiency and reduces preparation costs.
A platinum catalyst required 32 millivolts in the testing, but Ren said ongoing testing has reduced the hybrid catalyst requirements to about 40 millivolts, close to the platinum requirements. Equally important, he said, was the ability to increase current output at a faster rate than the increase in required energy input. The catalyst remained stable after 1,000 cycles at a constant current.
One day, microrobots may be able to swim through the human body like sperm or paramecia to carry out medical functions in specific locations. Researchers from the Max Planck Institute for Intelligent Systems in Stuttgart have developed functional elastomers that can be activated by magnetic fields to imitate the swimming gaits of natural flagella, cilia and jellyfish. Using a specially developed computer algorithm, the researchers can now automatically generate the optimal magnetic conditions for each gait for the first time. According to the Stuttgart-based scientists, this shape-programming technology also lends itself to numerous other micro-scale engineering applications in which chemical and physical processes are implemented on a minuscule scale.
A sperm cell is equipped with a flagellum (a tail-like extension), which beats constantly back and forth to propel the cell towards an egg. Researchers from the Max Planck Institute for Intelligent Systems in Stuttgart have now enabled an extremely thin strip of silicone rubber, just a few millimetres in length, to achieve a very similar swimming pattern. To do this, they embedded magnetizable neodymium-iron-boron particles into an elastic silicone rubber and subsequently magnetized this elastomer in a controlled way. Once the elastomer was placed in a specified magnetic field, the scientists were able to control its shape, making it beat back and forth in a wave-like fashion.
The scientists also succeeded in imitating the complex rowing movement of a cilium in a very similar way. Cilia are extremely fine hairs found on the surface of paramecia – they propel the organisms forward by using highly complex rowing strokes. The researchers also constructed an artificial jellyfish that has two soft tentacles, which have been programmed to carry out rowing-like swimming movements.
The crucial factor behind all of these movement processes is that different areas of the elastomer can react differently to an external magnetic field: some zones have to be attracted and others repelled. Otherwise, the elastomer would not be able to reshape into a wave or begin to roll up at its ends.
In order to enable different magnetic responses along the elastomer, the researchers leveraged two key ideas: "Firstly, we varied the density of the magnetizable particles along the elastomer and secondly we also controlled the magnetization orientation of these particles," explains Guo Zhan Lum, a scientist in the Department of Physical Intelligence at the Max Planck Institute in Stuttgart.
Stanford researchers accidentally discovered that iron nanoparticles invented for anemia treatment have another use: triggering the immune system's ability to destroy tumor cells.
Iron nanoparticles can activate the immune system to attack cancer cells, according to a study led by researchers at the Stanford University School of Medicine.
The nanoparticles, which are commercially available as the injectable iron supplement ferumoxytol, are approved by the Food and Drug Administration to treat iron deficiency anemia.
The mouse study found that ferumoxytol prompts immune cells called tumor-associated macrophages to destroy cancer cells, suggesting that the nanoparticles could complement existing cancer treatments. The discovery, described in a paper published online Sept. 26 in Nature Nanotechnology, was made by accident while testing whether the nanoparticles could serve as Trojan horses by sneaking chemotherapy into tumors in mice.
"It was really surprising to us that the nanoparticles activated macrophages so that they started to attack cancer cells in mice," said Heike Daldrup-Link, MD, who is the study's senior author and an associate professor of radiology at the School of Medicine. "We think this concept should hold in human patients, too."
Daldrup-Link's team conducted an experiment that used three groups of mice: an experimental group that got nanoparticles loaded with chemo, a control group that got nanoparticles without chemo and a control group that got neither. The researchers made the unexpected observation that the growth of the tumors in control animals that got nanoparticles only was suppressed compared with the other controls.
The researchers conducted a series of follow-up tests to characterize what was happening. Experimenting with cells in a dish, they showed that immune cells called tumor-associated macrophages were required for the nanoparticles' anti-cancer activity; in cell cultures without macrophages, the iron nanoparticles had no effect against cancer cells.
A team of international researchers has sequenced the genome of the seagrass Zostera marina to gain insight into how the flowering plant re-adapted to saltwater living. As the team led by Thorsten Reusch at the GEOMAR Helmholtz Center for Ocean Research-Kiel and Yves Van de Peer from Ghent University reported today in Nature, the Z. marina genome lost a number of genes that are integral for other angiosperms. At the same time, it regained functions that other flowering plants have lost.
Seagrasses are the only flowering plants that have returned to a marine environment and they are found throughout the temperate northern hemisphere in both the Atlantic and Pacific Oceans. There, they form underwater meadows in which a great number of species live, including sea otters, halibut, and clams, noted Susan Williams from the Bodega Marine Laboratory at the University of California, Davis, in a related Nature commentary.
But these environments are threatened, the researchers noted. "All this makes seagrass interesting for the study of the relationship between the complex gene networks affecting temperature tolerance, like climate warming, and the mechanisms of salt tolerance through osmoregulation," said first author Jeanine Olsen, a professor of marine biology at the University of Groningen, in a statement.
She and her colleagues collected Z. marina, also known as eelgrass, from the Archipelago Sea, southwest of Finland, for sequencing. Using a combination of fosmid-ends and whole-genome shotgun approaches, they generated a 202.3-megabase Z. marina genome that encodes some 20,450 protein-coding genes. Nearly 87 percent of those protein-coding genes are supported by transcriptomic data, they noted.
Based on an analysis of synonymous substitutions, the researchers reported that the Z. marina genome harbors echoes of an ancient whole-genome duplication event that they estimated took place between 72 million and 64 million years ago, after the divergence of Zostera and the freshwater duckweed Spirodela some 135 million to 107 million years ago. This, they said, indicates a duplication event that's independent of the two reported in Spirodela.
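Dating a duplication from synonymous substitutions follows the standard molecular-clock relation T = Ks / (2r): paralogs accumulate silent changes on both copies at rate r. A minimal sketch with hypothetical values for Ks and the rate (the paper's actual parameters are not reproduced here):

```python
# Molecular-clock dating of a duplication event: T = Ks / (2 * r),
# where Ks is the synonymous distance between two paralogs and r is
# the per-lineage substitution rate (subs/site/year).
# Both values below are hypothetical, for illustration only.
Ks = 0.8
r = 6.1e-9

T_years = Ks / (2 * r)
print(f"~{T_years / 1e6:.0f} million years ago")
```

With these illustrative inputs the estimate lands in the 64-72-million-year window the authors report; in practice Ks is averaged over many paralog pairs and the rate is calibrated against fossil or divergence evidence.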
The researchers also noted transposable element activity in the Z. marina genome and that genes gained by eelgrass tended to be closer to such elements than conserved genes.
Olsen and her colleagues then mapped those gains and losses of gene families onto a phylogenetic tree. While the researchers found that Zostera and Spirodela share a number of genes, the Zostera genome has lost a number of genes linked to its saltwater home.
For instance, it has lost all genes involved in stomatal differentiation. In land plants, stomata on leaves are a key structure that enables plants to regulate gas exchange and prevent water loss. These pores, added Bodega's Williams, aren't essential in seagrass, which doesn't contend as much with moisture loss and instead absorbs gases directly through its outer cell layers.
The Zostera genome has also lost genes involved in volatile synthesis and sensing pathways, including ethylene sensing, and in UV damage response. Volatile compound sensing is a defense mechanism against insects, which seagrass doesn't have to contend with as much, while UV-induced damage is also less of an issue in seagrass' dimly lit submarine environment.
At the same time, the Zostera genome has gained genes that enable it to adapt to its environment.
Scientists have long understood that mother’s milk provides immune protection against some infectious agents through the transfer of antibodies, a process referred to as “passive immunity.”
A research team at the University of California, Riverside now shows that mother’s milk also contributes to the development of the baby’s own immune system by a process the team calls “maternal educational immunity.”
Specific maternal immune cells in the milk cross the wall of the baby’s intestine to enter an immune organ called the thymus. Once there, they “educate” developing cells to attack the same infectious organisms to which the mother has been exposed.
The research, which used mouse foster-nursing models, has important implications for vaccinating newborn babies: the researchers show that vaccinating the mother results, through this process, in vaccination of the baby.
“It’s another way moms provide immune information to their babies,” said Ameae Walker, a professor of biomedical sciences in the UC Riverside School of Medicine, who led the research. “It’s as though the mother is saying, ‘Look what I have seen in the environment that you need to be immune to as well.’ The replicas – the copies of the maternal immune cells that the baby makes – will provide immunity to the baby for life.”
The research results appear in the Sept. 15 issue of the Journal of Immunology.
Like everything else in the Universe, stars come in a variety of shapes, sizes and colors, three characteristics that are interconnected.
The wavelength at which a star emits the most light is called the star's "peak wavelength" (given by Wien's displacement law), which is the peak of its Planck curve. However, how that light appears to the human eye is also moderated by the contributions of the other parts of its Planck curve.
In short, when the various colors of the spectrum are combined, they appear white to the naked eye. This will make the apparent color of the star appear lighter than where star’s peak wavelength falls on the color spectrum. Consider our Sun. Despite the fact that its peak emission wavelength corresponds to the green part of the spectrum, its color appears pale yellow.
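The Sun's peak wavelength can be checked directly from Wien's displacement law, lambda_peak = b / T. The Wien constant and the Sun's effective surface temperature (~5778 K) are standard reference values:

```python
# Wien's displacement law: lambda_peak = b / T.
b = 2.897771955e-3   # Wien displacement constant, m*K
T_sun = 5778         # K, effective surface temperature of the Sun

peak_nm = b / T_sun * 1e9
print(f"{peak_nm:.0f} nm")  # ~500 nm, in the green part of the spectrum
```

The result falls around 500 nm (green), yet the Sun looks pale yellow to us – exactly because the rest of its Planck curve blends in, as described above.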
A star’s composition is the result of its formation history. Every star is born of a nebula made up of gas and dust, and each one is different. While nebulas in the interstellar medium are largely composed of hydrogen, which is the main fuel for star creation, they also carry other elements. The overall mass of the nebula, as well as the various elements that make it up, determine what kind of star will result.
The change in color these elements add to stars is not very obvious, but it can be studied using spectral analysis. By examining the various wavelengths a star produces with a spectrometer, scientists can determine which elements are being fused inside it.
The other major factor affecting a star’s color is its temperature. As stars increase in heat, the overall radiated energy increases, and the peak of the curve moves to shorter wavelengths. In other words, as a star becomes hotter, the light it emits is pushed further and further towards the blue end of the spectrum. As stars grow colder, the situation is reversed.
A third and final factor that affects the light a star appears to emit is known as the Doppler Effect. When it comes to sound, light, and other waves, the frequency can increase or decrease based on the relative motion between the source and the observer.
In astronomy, this effect causes what is known as “redshift” and “blueshift” – the visible light coming from a distant star is shifted towards the red end of the spectrum if the star is moving away from us, and towards the blue end if it is moving closer.
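For speeds well below that of light, the shift follows the non-relativistic Doppler relation, lambda_obs = lambda_emit * (1 + v/c), with positive v for a receding source. A minimal sketch with an illustrative velocity:

```python
# Non-relativistic Doppler shift: lambda_obs = lambda_emit * (1 + v/c).
# A positive v (receding source) stretches wavelengths toward the red.
c = 299_792_458.0  # speed of light, m/s

def shifted(lambda_emit_nm, v_ms):
    """Observed wavelength (nm) for a source moving at v_ms (m/s)."""
    return lambda_emit_nm * (1 + v_ms / c)

# A star receding at 600 km/s: 500 nm green light arrives slightly redder.
print(shifted(500.0, 600_000.0))  # ~501 nm
```

For speeds approaching c (distant galaxies rather than nearby stars), the full relativistic formula is needed, but the sign convention is the same: recession reddens, approach blues.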
Astronomers using NASA's Hubble Space Telescope have imaged what may be water vapor plumes erupting off the surface of Jupiter's moon Europa. This finding bolsters other Hubble observations suggesting the icy moon erupts with high altitude water vapor plumes.
The composite image shows suspected plumes of water vapor erupting at the 7 o’clock position off the limb of Jupiter’s moon Europa. The plumes, photographed by NASA’s Hubble Space Telescope Imaging Spectrograph, were seen in silhouette as the moon passed in front of Jupiter. Hubble’s ultraviolet sensitivity allowed the features – rising over 100 miles (160 kilometers) above Europa’s icy surface – to be discerned. The water is believed to come from a subsurface ocean on Europa. The Hubble data were taken on January 26, 2014. The image of Europa, superimposed on the Hubble data, is assembled from data from the Galileo and Voyager missions.
The observation increases the possibility that missions to Europa may be able to sample Europa’s ocean without having to drill through miles of ice.
“Europa’s ocean is considered to be one of the most promising places that could potentially harbor life in the solar system,” said Geoff Yoder, acting associate administrator for NASA’s Science Mission Directorate in Washington. “These plumes, if they do indeed exist, may provide another way to sample Europa’s subsurface.”
The plumes are estimated to rise about 125 miles (200 kilometers) before, presumably, raining material back down onto Europa’s surface. Europa has a huge global ocean containing twice as much water as Earth’s oceans, but it is protected by a layer of extremely cold and hard ice of unknown thickness. The plumes provide a tantalizing opportunity to gather samples originating from under the surface without having to land or drill through the ice.
The team, led by William Sparks of the Space Telescope Science Institute (STScI) in Baltimore, observed these finger-like projections while viewing Europa’s limb as the moon passed in front of Jupiter.
The original goal of the team’s observing proposal was to determine whether Europa has a thin, extended atmosphere, or exosphere. Using the same observing method that detects atmospheres around planets orbiting other stars, the team realized that if water vapor were venting from Europa’s surface, this observation would be an excellent way to see it.
“The atmosphere of an extrasolar planet blocks some of the starlight that is behind it,” Sparks explained. “If there is a thin atmosphere around Europa, it has the potential to block some of the light of Jupiter, and we could see it as a silhouette. And so we were looking for absorption features around the limb of Europa as it transited the smooth face of Jupiter.”
Double-helix molecules are frequently encountered in biological and synthetic organic systems, where they typically provide improved strength and better electrical properties relative to materials containing linear chains or single helices. DNA is the defining example. A purely inorganic double helix has been hard to come by, until now.
A team of some 20 researchers led by Tom Nilges of the Technical University of Munich has prepared the first completely inorganic substance featuring a well-defined double-helix structure: SnIP. This semiconducting material consists of a twisted tin iodide (SnI+) chain intertwined with a twisted phosphide (P–) chain. The team prepared gram amounts of SnIP by heating tin, red phosphorus, and tin tetraiodide together (Adv. Mater. 2016, DOI: 10.1002/adma.201603135).
Chemists have been seeking out inorganic double helices for decades. Researchers have reported X-ray crystal structures of bulk LiP and LiAs containing spiral and coaxial chains, but it remained unclear as to whether they should be called double-helix structures. More recently, researchers have attempted making metal or metal salt double-helix materials using nanotubes or DNA as templates. But a non-templated, carbon-free example with a fully characterized double-helix structure had remained elusive.
The Food and Drug Administration approved a so-called artificial pancreas Wednesday. The first-of-its-kind device, the size of a cell phone, monitors and treats patients with type 1 diabetes, also known as juvenile diabetes.
In those with type 1 diabetes, the pancreas does not produce enough insulin, a hormone people need to get energy from food. The Medtronic MiniMed 670G system continuously monitors glucose (blood sugar) levels and delivers needed insulin to patients.
"This is a revolutionary day for the treatment of diabetes. We've been long awaiting the artificial pancreas, and it's exciting to see it," said Dr. Robert Courgi, an endocrinologist at Northwell Health's South Side Hospital in Bay Shore, New York.
The device, which requires a prescription and will become available during the spring, is intended for patients 14 or older, according to the company's website.
The Medtronic system includes a glucose meter (an electrode under the skin), an insulin pump strapped to the body and an infusion patch connected to the pump, with a tiny catheter for delivering insulin. The system measures a patient's glucose levels every five minutes and either administers or withholds insulin as needed, helping patients maintain glucose levels within the normal range the majority of the time.
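The closed loop described above – read glucose every five minutes, then administer or withhold insulin – can be sketched as a simple sense-decide-act rule. The thresholds and dosing rule below are purely illustrative; Medtronic's actual control algorithm is proprietary and far more sophisticated:

```python
# Hypothetical sketch of a closed-loop dosing decision like the one
# described: every 5 minutes, decide how much basal insulin to deliver.
# NOT the Medtronic MiniMed 670G algorithm; all numbers are illustrative.
TARGET_MG_DL = 120   # upper end of the illustrative target range
LOW_MG_DL = 80       # below this, withhold insulin entirely

def decide_dose(glucose_mg_dl, basal_units=0.02):
    """Return insulin units to deliver for one 5-minute cycle."""
    if glucose_mg_dl < LOW_MG_DL:
        return 0.0                 # trending low: withhold insulin
    if glucose_mg_dl <= TARGET_MG_DL:
        return basal_units         # in range: normal basal delivery
    # Above target: scale delivery with the excess (illustrative rule).
    return basal_units * (1 + (glucose_mg_dl - TARGET_MG_DL) / 50)

for reading in (70, 110, 180):
    print(reading, decide_dose(reading))
```

The point of the sketch is the structure of the loop, not the numbers: the device's value comes from making this decision automatically 288 times a day instead of leaving it to the patient.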
We normally think of an emotion as the internal awareness of a feeling, but there’s more to it than that, says Clint Perry at Queen Mary University of London. Physical changes to your body and shifts in your behavior accompany sensations of happiness or sadness.
“Many of these things actually cause the subjective feeling that we have,” says Perry. “Those are all necessary parts of emotion.” Researchers can measure those adjustments in behavior when they’re studying emotions in animals, he says.
When humans feel happy, we’re more likely than normal to respond to an ambiguous situation with optimism. If you’re happily eating a chocolate bar, for example, you might view a newcomer as a friend instead of a possible foe, says Perry.
Happiness also makes us quicker to shrug off a negative experience, such as getting cut off in traffic.
To see whether bumblebee behavior follows similar patterns, Perry and his colleagues trained 24 bees to associate particular locations and colors in the lab with cylinders of sugar water or plain water. They then closed these cylinders off, gave half the insects a dollop of water spiked with sugar – the bee equivalent of a bit of chocolate – and measured the time it took them to enter a separate container. This was located midway between the two closed cylinders and had an intermediate color, to make the bees unsure whether it contained rewarding sugar water or not.
Bumblebees that received the sugar jolt were faster to enter the ambiguous middle station than those that didn’t, as if they were more optimistic about the possibility that it contained food.
And it wasn’t simply that they were more active because of the energy boost. Sugar-dosed bees and their sugarless peers were equally quick to visit a container that the insects knew contained food, and equally slow to visit a station with just plain water.
Researchers from the National Institute of Standards and Technology (NIST) and collaborators have proposed a design for the first DNA sequencer based on an electronic nanosensor that can detect tiny motions as small as a single atom.
The proposed device—a type of capacitor, which stores electric charge—is a tiny ribbon of molybdenum disulfide suspended over a metal electrode and immersed in water. The ribbon is 15.5 nm long and 4.5 nm wide. Single-stranded DNA, containing a chain of bases (bits of genetic code), is threaded through a hole 2.5 nm wide in the thin ribbon. The ribbon flexes only when a DNA base pairs up with and then separates from a complementary base affixed to the hole. The membrane motion is detected as an electrical signal.
As described in a new paper, the NIST team made numerical simulations and theoretical estimates to show the membrane would be 79 to 86 percent accurate in identifying DNA bases in a single measurement at speeds up to about 70 million bases per second. Integrated circuits would detect and measure electrical signals and identify bases. The results suggest such a device could be a fast, accurate and cost-effective DNA sequencer, according to the paper.
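One way to see how a 79-86 percent single-read accuracy could still yield reliable base calls is to repeat each measurement and take a majority vote. The model below assumes independent errors and is an illustration of the statistics, not an analysis from the NIST paper:

```python
from math import comb

def majority_vote_accuracy(p, n):
    """Probability that the majority of n independent reads is correct,
    given per-read accuracy p (n odd; errors assumed independent)."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# Per-read accuracies quoted in the paper, consensus over 5 reads:
for p in (0.79, 0.86):
    print(p, round(majority_vote_accuracy(p, 5), 3))
```

Under these assumptions, five-read consensus lifts 79 percent to roughly 93 percent and 86 percent to roughly 98 percent, while the quoted ~70 million bases per second of raw speed leaves ample headroom for such repetition.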
Conventional sequencing, developed in the 1970s, involves separating, copying, labeling and reassembling pieces of DNA to read the genetic information. Newer methods include automated sequencing of many DNA fragments at once—still costly—and novel "nanopore sequencing" concepts. For example, the same NIST group recently demonstrated the idea of sequencing DNA by passing it through a graphene nanopore, and measuring how graphene's electronic properties respond to strain.
The latest NIST proposal relies on a thin film of molybdenum disulfide—a stable, layered material that conducts electricity and is often used as a lubricant. Among other advantages, this material does not stick to DNA, which can be a problem with graphene. The NIST team suggests the method might even work without a nanopore—a simpler design—by passing DNA across the edge of the membrane.
"This approach potentially solves the issue with DNA sticking to graphene if inserted improperly, because this approach does not use graphene, period," NIST theorist and lead author Alex Smolyanitsky said. "Another major difference is that instead of relying on the properties of graphene or any particular material used, we read motions electrically in an easier way by forming a capacitor. This makes any electrically conductive membrane suitable for the application."
A novel project has emerged from the research environment at the Department of Micro- and Nanosystem Technology (IMST): the legendary labyrinth from the 80s computer game Pac-Man has been recreated at microscale, with a diameter of under a millimeter, and filled with microscopic prey and predators swimming around in a nutrient-rich fluid.
The single-celled organisms Euglena and ciliates can be viewed as "Pac-Men", being hunted by "Ghosts" in the form of multicellular microscopic animals called rotifers. Using micro-scenography, the film's creator, Adam Bartley, used neon lighting to recreate the staging familiar from Pac-Man and capture it on film.
Although the simplest lifeforms appear to move around randomly, researchers discovered an eye-catching behavior in the multicellular rotifers: "When the rotifers were first introduced into the labyrinth, they were very cautious and proceeded slowly. However, after a lag of about a day, this changed and they dashed forward in a more focused way." Johannessen suggests that this may be due to chemical traces they leave behind, making it easier for them to find their way forward.
Digital tracing of the paths taken by the different species, in order to elucidate whether there is logic in the way they maneuver, will be one focus of a continuation of the project. "By creating a more complex habitat, in the form of a labyrinth, it becomes possible to establish zones that may be favorable for the organisms one is studying," says Johannessen, who adds that digital tracing could reveal behavioral changes on repeated exposure to the same environmental conditions.
Why do humans kill each other? It's a question that has been posed for millennia. At least part of the answer may lie in the fact that humans have evolved from a particularly violent branch of the animal family tree, according to a new study.
From the seemingly lovable lemur to the crafty chimpanzee and mighty gorilla, the mammalian order of primates — to which humans belong — kill within their own species nearly six times more often than the average mammal does, Spanish researchers found.
Whales rarely kill each other; the same goes for bats and rabbits. Some species of felines and canines occasionally kill others within their own species — for example, when sparring over territory or mates. Yet most primates use lethal violence with greater frequency than these other animal groups, sometimes even killing their fellow species members in organized raids.
Scientists have developed a new theory of freezing and melting of metals such as iron or copper.
The new results are published online, in the scientific journal Nature Communications. The scientists behind the new study hope that the theory will bring them closer to understanding how metals develop under the extreme pressure inside the Earth and how the fluid metals solidify as they lose heat to their surroundings and cool.
The new results show how the temperature at which a substance melts, the melting point, changes at higher pressure. "The melting temperature typically increases when we increase the pressure. For example, iron melts at 1,538 degrees centigrade at one atmosphere of pressure. But at the high pressure of the Earth's core, iron first melts at more than 5,000 degrees," says lead author Ulf Rørbæk Pedersen, from Roskilde University, Denmark.
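One widely used empirical description of how a metal's melting temperature rises with pressure is the Simon–Glatzel equation. The sketch below uses illustrative fit parameters for iron chosen so the curve matches the figures quoted above (roughly 1,811 K at ambient pressure, above 5,000 K near core pressure); they are not values from the new study:

```python
def simon_glatzel(P_GPa, T0=1811.0, a=45.0, c=2.0):
    """Simon-Glatzel melting curve: T_m(P) = T0 * (1 + P/a)**(1/c).
    T0: ambient-pressure melting point in kelvin (iron: ~1811 K, i.e. 1538 C);
    a (in GPa) and c are fitted constants -- the defaults here are
    illustrative, not fits from the paper."""
    return T0 * (1.0 + P_GPa / a) ** (1.0 / c)

# From ambient pressure up toward inner-core pressure (~330 GPa)
for P in (0, 50, 135, 330):
    print(f"{P:4d} GPa -> T_m ~ {simon_glatzel(P):.0f} K")
```

The qualitative point is the one Pedersen makes: melting temperature climbs steeply with pressure, so iron that melts at about 1,538 °C at the surface stays solid far above that temperature deep inside the Earth.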
“The question is how melting and freezing phenomena change as the pressure increases. An interesting suggestion is Lindemann’s theory from 1910,” says Pedersen.
When you heat up a crystal, the molecules begin to vibrate around their positions within the crystal. Lindemann suggested that at some point the vibrations become so violent that the crystal simply breaks down and melts.
“Now, for the first time, we can understand how large the vibrations need to be before the crystal melts, and that this depends on the pressure, which is the opposite of what Lindemann thought,” says Pedersen.
The physicists can now predict how quickly the crystal melts when the melting point is reached and, conversely, how quickly the atoms organize themselves when the substance begins to crystallize.
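The classic Lindemann picture can be stated as a simple criterion: the crystal melts once the root-mean-square vibration amplitude exceeds a fixed fraction (often taken as about 10%) of the interatomic spacing. Here is a minimal sketch of that textbook criterion with made-up example numbers; it illustrates Lindemann's original idea, not the new pressure-dependent theory from the paper:

```python
import math

def lindemann_melts(msd, spacing, delta_c=0.1):
    """Classic Lindemann criterion: melting is predicted when
    sqrt(<u^2>) / a exceeds a critical ratio delta_c (~0.1 for many metals).
    msd: mean-square vibration amplitude <u^2> (nm^2)
    spacing: nearest-neighbour distance a (nm)"""
    ratio = math.sqrt(msd) / spacing
    return ratio, ratio > delta_c

# Example: atoms 0.25 nm apart vibrating with <u^2> = 0.0009 nm^2
ratio, melts = lindemann_melts(0.0009, 0.25)
print(f"Lindemann ratio {ratio:.3f} -> melts: {melts}")
```

The key point of the new work is that the critical ratio is not a universal constant, as Lindemann assumed, but itself changes with pressure.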
Earth is on track to sail past the two degrees Celsius (3.6 degrees Fahrenheit) threshold for dangerous global warming by 2050, seven of the world's top climate scientists warned Thursday.
"Climate change is happening now, and much faster than anticipated," said Sir Robert Watson, former head of the UN's Intergovernmental Panel on Climate Change (IPCC), the body charged with distilling climate science for policy makers.
Since 1990, devastating weather-related events—floods, drought, more intense storms, heat waves and wild fires—due to climate change have doubled in number, Watson and the other scientists said in a report.
"Without additional efforts by all major emitters (of greenhouse gases), the 2C target could be reached even sooner," he told journalists in a phone briefing.
The planet has already heated up 1.0 C (1.8 F) above the pre-industrial benchmark, and could see its first year at 1.5 C within a decade, scientists reported at a conference in Oxford last week.
The Paris Agreement, inked by 195 nations in December, set an even more ambitious target, vowing to cap warming at "well under" 2C, and even 1.5C if possible. The pact will likely enter into force by the end of the year, record speed for an international treaty.
IBM announced recently the Watson-based “Project DataWorks,” the first cloud-based data and analytics platform to integrate all types of data and enable AI-powered decision-making.
Project DataWorks is designed to make it simple for business leaders and data professionals to collect, organize, govern, and secure data, and become a “cognitive business.”
Achieving data insights is increasingly complex, and most of this work is done by highly skilled data professionals who work in silos with disconnected tools and data services that may be difficult to manage, integrate, and govern, says IBM. Businesses must also continually iterate their data models and products — often manually — to benefit from the most relevant, up-to-date insights.
IBM says Project DataWorks can help businesses break down these barriers by connecting all data and insights for their users into an integrated, self-service platform.
Available on Bluemix, IBM’s Cloud platform, Project DataWorks is designed to help organizations:
Automate the deployment of data assets and products using cognitive-based machine learning and Apache Spark;
Ingest data faster than any other data platform, from 50 to hundreds of Gbps, and from all endpoints: enterprise databases, Internet of Things, weather, and social media;
Leverage an open ecosystem of more than 20 partners and technologies, such as Confluent, Continuum Analytics, Galvanize, Alation, NumFOCUS, RStudio, Skymind, and more.
The CRISPR-Cas adaptive immune system defends microbes against foreign genetic elements via DNA or RNA-DNA interference. A group of scientists now characterize the Class 2 type VI-A CRISPR-Cas effector C2c2 and demonstrate its RNA-guided RNase function. C2c2 from the bacterium Leptotrichia shahii provides interference against RNA phage. In vitro biochemical analyses show that C2c2 is guided by a single crRNA and can be programmed to cleave ssRNA targets carrying complementary protospacers. In bacteria, C2c2 can be programmed to knock down specific mRNAs. Cleavage is mediated by catalytic residues in the two conserved HEPN domains, mutations in which generate catalytically inactive RNA-binding proteins. These results broaden the understanding of CRISPR-Cas systems and suggest that C2c2 can be used to develop new RNA-targeting tools.
International teams of astronomers have used the Atacama Large Millimeter/submillimeter Array (ALMA) to explore the distant corner of the Universe first revealed in the iconic images of the Hubble Ultra Deep Field (HUDF). These new ALMA observations are significantly deeper and sharper than previous surveys at millimetre wavelengths. They clearly show how the rate of star formation in young galaxies is closely related to their total mass in stars. They also trace the previously unknown abundance of star-forming gas at different points in time, providing new insights into the “Golden Age” of galaxy formation approximately 10 billion years ago.
The new ALMA results will be published in a series of papers appearing in the Astrophysical Journal and Monthly Notices of the Royal Astronomical Society. These results are also among those being presented this week at the Half a Decade of ALMA conference in Palm Springs, California, USA.
In 2004 the Hubble Ultra Deep Field images — pioneering deep-field observations with the NASA/ESA Hubble Space Telescope — were published. These spectacular pictures probed more deeply than ever before and revealed a menagerie of galaxies stretching back to less than a billion years after the Big Bang. The area was observed several times by Hubble and many other telescopes, resulting in the deepest view of the Universe to date.
Astronomers using ALMA have now surveyed this seemingly unremarkable, but heavily studied, window into the distant Universe for the first time both deeply and sharply in the millimetre range of wavelengths. This allows them to see the faint glow from gas clouds and also the emission from warm dust in galaxies in the early Universe.
ALMA has observed the HUDF for a total of around 50 hours up to now. This is the largest amount of ALMA observing time spent on one area of the sky so far.
One team led by Jim Dunlop (University of Edinburgh, United Kingdom) used ALMA to obtain the first deep, homogeneous ALMA image of a region as large as the HUDF. This data allowed them to clearly match up the galaxies that they detected with objects already seen with Hubble and other facilities.
This study showed clearly for the first time that the stellar mass of a galaxy is the best predictor of star formation rate in the high redshift Universe. They detected essentially all of the high-mass galaxies and virtually nothing else.
Jim Dunlop, lead author on the deep imaging paper sums up its importance: “This is a breakthrough result. For the first time we are properly connecting the visible and ultraviolet light view of the distant Universe from Hubble and far-infrared/millimetre views of the Universe from ALMA.”
The second team, led by Manuel Aravena of the Núcleo de Astronomía, Universidad Diego Portales, Santiago, Chile, and Fabian Walter of the Max Planck Institute for Astronomy in Heidelberg, Germany, conducted a deeper search across about one sixth of the total HUDF.
“We conducted the first fully blind, three-dimensional search for cool gas in the early Universe,” said Chris Carilli, an astronomer with the National Radio Astronomy Observatory (NRAO) in Socorro, New Mexico, USA and member of the research team. “Through this, we discovered a population of galaxies that is not clearly evident in any other deep surveys of the sky.” 
Some of the new ALMA observations were specifically tailored to detect galaxies that are rich in carbon monoxide, indicating regions primed for star formation. Even though these molecular gas reservoirs give rise to the star formation activity in galaxies, they are often very hard to see with Hubble. ALMA can therefore reveal the “missing half” of the galaxy formation and evolution process.
"The new ALMA results imply a rapidly rising gas content in galaxies as we look back further in time,” adds lead author of two of the papers, Manuel Aravena (Núcleo de Astronomía, Universidad Diego Portales, Santiago, Chile). “This increasing gas content is likely the root cause for the remarkable increase in star formation rates during the peak epoch of galaxy formation, some 10 billion years ago.”
It sounds like something out of a science fiction novel, but there are many people who think Mars will be the next frontier for human life.
One of the highest profile believers, billionaire tech entrepreneur Elon Musk, revealed an ambitious plan on Tuesday to start colonizing the Red Planet in the next 10 years.
Musk, who operates electric car company Tesla Motors, is also the founder and lead designer of aerospace company SpaceX, which is now focused on satellite deliveries and unmanned cargo runs to the International Space Station. But, the company is also working on an unmanned Dragon capsule launch for Mars in 2018.
Elon Musk wants a ticket to Mars to cost less than $200,000. It’s an ambitious goal, but then so is establishing a viable human presence on the Red Planet.
On Sept. 27, the SpaceX CEO laid out his blueprint for putting a colony on Mars, which he says would need at least a million people to become self-sustaining. It’s a milestone he says would be feasible within 40 to 100 years from today. Musk imagines fleets of hundreds of ships leaving Earth every few years, when the two planets are at their closest.
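As rough back-of-the-envelope arithmetic (the per-ship capacity here is my assumption, not a figure from Musk's talk), the scale implied by those numbers can be sketched: Earth-Mars launch windows open roughly every 26 months, so reaching a million people within 40 to 100 years requires hundreds of ships per window.

```python
def ships_per_window(target_people, years, people_per_ship=100, window_months=26):
    """Rough fleet size needed at each Earth-Mars launch window.
    people_per_ship=100 is an assumed capacity; windows open ~every 26 months.
    Ignores population growth on Mars and return trips."""
    windows = years * 12 // window_months          # number of launch windows
    total_ships = -(-target_people // people_per_ship)  # ceiling division
    return -(-total_ships // windows)

for years in (40, 100):
    print(f"{years} years: ~{ships_per_window(1_000_000, years)} ships per window")
```

Under these assumptions, even the slow 100-year timeline calls for a couple of hundred ships departing at every window, which is consistent with Musk's picture of fleets of hundreds of ships leaving every few years.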
The timeline and the scale may sound ludicrous, but the point of the discussion—and SpaceX itself—is to “make Mars seem possible. Make it seem as though it is something we can do in our lifetimes.”
Speaking to a crowd of space fans at the International Astronautical Congress, Musk framed the challenge as one of ensuring the survival of the human species. “History suggests there will be some doomsday event,” he said. “The alternative is to become a space-going civilization and a multi-planet species.”
But he has motivations beyond saving the human race. “[T]he argument I actually find most compelling is that it would be an incredible adventure. I think it would be the most inspiring thing that I could possibly imagine. Life needs to be more than just solving problems every day. You need to wake up and be excited about the future.”
Musk estimated that the first spacecraft could be ready in four years, but he noted that when it comes to timelines, “I’m not the best at this sort of thing.” (His Falcon Heavy rocket is still years behind schedule, and his electric car company, Tesla, has been similarly prone to missing deadlines.)
Part of the solution is scaling up his current rocket technology. Musk showed off a huge, carbon fiber tank the company has built to contain propellant for the rocket that would take people to Mars, and earlier this week he tweeted out the news of the successful test-firing of its engine.
The World Bank has released a new report highlighting the fact that air pollution costs world governments billions upon billions every year and ranks among the leading causes of death worldwide. The estimates — drawn from a number of sources, including the World Health Organization’s most recently completed data sets compiled in 2013 — can for the first time begin to examine the overall welfare cost of air pollution.
Specifically, researchers studied the amount of money that world governments must spend on health emergencies, long-term illnesses and chronic conditions caused by air pollution. They also took into account missed work and unemployment subsidies. The report finds that, in terms of the economy, the burden is extremely high.
To be sure, some countries come out of this analysis relatively well off. For example, Iceland only loses $3 million of its gross domestic product to air pollution. Given that the country has a relatively small population and a slight industrial profile, that’s probably not that surprising though.
Other countries, like Liberia, performed relatively well despite their low levels of economic development. Several African nations also have low overall air pollution impact costs. Despite mid-to-high populations, infrastructure is comparatively low density in places like Malawi and Zimbabwe, so perhaps this isn’t that surprising either.
It’s when we get to rapidly developing and “developed” nations that the costs really start to mount up. For example, the United States is estimated to lose $45 billion every year due to air pollution, while the UK loses $7.6 billion annually. Germany comes in at $18 billion, though it will be interesting to see how the country’s renewable energy strategy might alter that figure over the coming years.
China, one of the most rapidly developing nations in the world, is estimated to be losing a staggering 10 percent of its overall GDP, while India is not far behind at roughly eight percent.
Financial losses will, however, seem trivial when we look at the potential human cost of air pollution.
The World Bank estimates that air pollution kills roughly five and a half million people every year, or, to put that another way, it is responsible for roughly one in ten deaths worldwide.