Members of Genlisea, a genus of carnivorous plants, possess the smallest genomes known in plants. To elucidate genomic evolution in the group as a whole, researchers have now surveyed a wider range of species, and found a new record-holder.
The genus Genlisea (corkscrew plants) belongs to the bladderwort family (Lentibulariaceae), a family of carnivorous plants. Some of the 29 species of Genlisea that have been described possess tiny genome sizes. Indeed, the smallest genome yet discovered among flowering plants belongs to a member of the group. The term 'genome' here refers to all genetic material arranged in a set of individual chromosomes present in each cell of a given species. An international team of researchers, led by Professor Günther Heubl of LMU's Department of Biology, has now explored, for the first time, the evolution of genome size and chromosome number in the genus. Heubl and his collaborators studied just over half the known species of Genlisea, and their findings are reported in the latest issue of the journal Annals of Botany.
"During the evolution of the genus, the genomes of some Genlisea species must have undergone a drastic reduction in size, which was accompanied by miniaturization of chromosomes, but an increase in chromosome number," says Dr. Andreas Fleischmann, a member of Heubl's research group. Indeed, the chromosomes of the corkscrew plants are so minute that they can only just be resolved by conventional light microscopy. With the aid of an ingenious preparation technique, Dr. Aretuza Sousa, a specialist in cytogenetics and cytology at the Institute of Systematic Botany at LMU, was able to visualize the ultrasmall chromosomes of Genlisea species by fluorescence microscopy. Thanks to this methodology, the researchers were able to identify individual chromosomes and determine their number, as well as measuring the total DNA content of the nuclear genomes of selected representatives of the genus.
The LMU researchers also discovered a new record-holder. Genlisea tuberosa, a species recently discovered in Brazil and first described by Andreas Fleischmann in collaboration with Brazilian botanists, turns out to have a genome that encompasses only 61 million base pairs (Mbp; genome size is expressed as the total number of nucleotide base pairs in the DNA double helix). G. tuberosa thus possesses the smallest plant genome known, beating the previous record by 3 Mbp. Moreover, genome sizes vary widely between different Genlisea species, spanning the range from ~60 to 1700 Mbp.
In a development that holds promise for future magnetic memory and logic devices, researchers with the U.S. Department of Energy (DOE)’s Lawrence Berkeley National Laboratory (Berkeley Lab) and Cornell University successfully used an electric field to reverse the magnetization direction in a multiferroic spintronic device at room temperature. This demonstration, which runs counter to conventional scientific wisdom, points the way toward spintronics and smaller, faster and cheaper ways of storing and processing data.
“Our work shows that 180-degree magnetization switching in the multiferroic bismuth ferrite can be achieved at room temperature with an external electric field when the kinetics of the switching involves a two-step process,” says Ramamoorthy Ramesh, Berkeley Lab’s Associate Laboratory Director for Energy Technologies, who led this research. “We exploited this multi-step switching process to demonstrate energy-efficient control of a spintronic device.”
Ramesh, who also holds the Purnendu Chatterjee Endowed Chair in Energy Technologies at the University of California (UC) Berkeley, is the senior author of a paper describing this research in Nature. The paper is titled “Deterministic switching of ferromagnetism at room temperature using an electric field.” John Heron, now with Cornell University, is the lead and corresponding author.
“The electrical currents that today’s memory and logic devices rely on to generate a magnetic field are the primary source of power consumption and heating in these devices,” he says. “This has triggered significant interest in multiferroics for their potential to reduce energy consumption while also adding functionality to devices.” To demonstrate the potential technological applicability of their technique, Ramesh, Heron and their co-authors used heterostructures of bismuth ferrite and cobalt iron to fabricate a spin-valve, a spintronic device consisting of a non-magnetic material sandwiched between two ferromagnets whose electrical resistance can be readily changed. X-ray magnetic circular dichroism photoemission electron microscopy (XMCD-PEEM) images showed a clear correlation between magnetization switching and the switching from high-to-low electrical resistance in the spin-valve. The XMCD-PEEM measurements were completed at PEEM-3, an aberration corrected photoemission electron microscope at beamline 11.0.1 of Berkeley Lab’s Advanced Light Source.
Phen-Gen is the first computer analysis software to cross-reference a patient's symptoms with his or her genome sequence, to better aid doctors in diagnosing diseases. The software was created by a team of scientists at A*STAR's Genome Institute of Singapore (GIS), led by Dr. Pauline Ng. Results from the research were published in the prestigious journal Nature Methods on 4th August 2014.
Though printing items like chocolate and pizza might be satisfying enough for some, 3D printing still holds a lot of unfulfilled potential. Talk abounds of disrupting manufacturing, changing the face of construction and even building metal components in space. While it is hard not to get a little bit excited by these potentially world-changing advances, there is one domain where 3D printing is already having a real-life impact. Its capacity to produce customized implants and medical devices tailored specifically to a patient's anatomy has seen it open up all kinds of possibilities in the field of medicine, with the year 2014 having turned up one world-first surgery after another. Let's cast our eye over some of the significant, life-changing procedures to emerge in the past year made possible by 3D printing technology.
NASA's Mars Curiosity rover has measured a tenfold spike in methane, an organic chemical, in the atmosphere around it and detected other organic molecules in a rock-powder sample collected by the robotic laboratory's drill.
"This temporary increase in methane—sharply up and then back down—tells us there must be some relatively localized source," said Sushil Atreya of the University of Michigan, Ann Arbor, and Curiosity rover science team. "There are many possible sources, biological or non-biological, such as interaction of water and rock."
Researchers used Curiosity's onboard Sample Analysis at Mars (SAM) laboratory a dozen times in a 20-month period to sniff methane in the atmosphere. During two of those months, in late 2013 and early 2014, four measurements averaged seven parts per billion. Before and after that, readings averaged only one-tenth that level.
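The arithmetic behind the "tenfold spike" headline can be sketched directly from the figures above (a minimal check; the background value is inferred from "one-tenth that level" rather than quoted as a measurement):

```python
# Curiosity SAM methane readings, per the article. The background value is
# an assumption derived from "readings averaged only one-tenth that level".
spike_ppb = 7.0                    # average of the four elevated measurements
background_ppb = spike_ppb / 10.0  # ~0.7 ppb before and after the spike

enhancement = spike_ppb / background_ppb
print(f"background ~{background_ppb:.1f} ppb, enhancement ~{enhancement:.0f}x")
```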
Curiosity also detected different Martian organic chemicals in powder drilled from a rock dubbed Cumberland, the first definitive detection of organics in surface materials of Mars. These Martian organics could either have formed on Mars or been delivered to Mars by meteorites.
Organic molecules, which contain carbon and usually hydrogen, are chemical building blocks of life, although they can exist without the presence of life. Curiosity's findings from analyzing samples of atmosphere and rock powder do not reveal whether Mars has ever harbored living microbes, but the findings do shed light on a chemically active modern Mars and on favorable conditions for life on ancient Mars.
"We will keep working on the puzzles these findings present," said John Grotzinger, Curiosity project scientist of the California Institute of Technology in Pasadena (Caltech). "Can we learn more about the active chemistry causing such fluctuations in the amount of methane in the atmosphere? Can we choose rock targets where identifiable organics have been preserved?"
In a study in the journal Neuron, scientists describe a new high-data-rate, low-power wireless brain sensor. The technology is designed to enable neuroscience research that cannot be accomplished with current sensors, which tether subjects with cabled connections. Experiments in the paper confirm this new capability: the technology transmitted rich, neuroscientifically meaningful signals from animal models as they slept and woke or exercised.
“We view this as a platform device for tapping into the richness of electrical signals from the brain among animal models where their neural circuit activity reflects entirely volitional and naturalistic behavior, not constrained to particular space,” said Arto Nurmikko, professor of engineering and physics affiliated with the Brown Institute for Brain Science and the paper’s senior and corresponding author. “This enables new types of neuroscience experiments with vast amounts of brain data wirelessly and continuously streamed from brain microcircuits.”
“The brain sensor is opening unprecedented opportunities for the development of neuroprosthetic treatments in natural and unconstrained environments,” said study co-author Grégoire Courtine, a professor at EPFL (École polytechnique fédérale de Lausanne), who collaborated with Nurmikko’s group on the research. To confirm the system performance, the researchers did a series of experiments with rhesus macaques, which walked on a treadmill while the researchers used the wireless system to measure neural signals associated with the brain’s motion commands. They also did another experiment in which animal subjects went through sleep/wake cycles, unencumbered by cables or wires; the data showed distinct patterns related to the different stages of consciousness and the transitions between them.
“We hope that the wireless neurosensor will change the canonical paradigm of neuroscience research, enabling scientists to explore the nervous system within its natural context and without the use of tethering cables,” said co-lead author David Borton. “Subjects are free to roam, forage, sleep, etc., all while the researchers are observing the brain activity. We are very excited to see how the neuroscience community leverages this platform.”
Every so often our Earth encounters a large chunk of cosmic debris, a reminder that our solar system still contains plenty of material that could potentially have an impact on life on Earth.
While the great bulk of planetary accretion occurs in the first few hundred million years after the birth of a given system, the process never really comes to an end. Most of the objects that make up the tail of this accretion – grains of dust, lumps of ice, and pieces of rock – smash into our atmosphere and ablate harmlessly many kilometres above the ground, visible only as shooting stars. Larger impacts do, however, continue to occur – as illustrated on February 15, 2013, in the Russian city of Chelyabinsk. On that day, with no warning, a small near-Earth asteroid detonated in the atmosphere, and outshone the noon-day sun.
Though the object itself was relatively small, around 20m in diameter, it exploded with sufficient force to shatter windows many kilometres away, damaging more than 7,000 buildings. Amazingly, nobody was killed – but the impact served as a stark reminder of the dangers posed by rocks from space. The longer the timescale we consider, the larger the biggest collision the Earth might experience. A stand-out example is the impact, around 65 million years ago, thought to have contributed to the extinction of the dinosaurs.
Fortunately for us here on Earth, the rate at which such catastrophic impacts occur is relatively low, but this might not be the case in other planetary systems. Thanks to observations carried out at infrared wavelengths, we are now in a position to start categorising the small-object populations of other planetary systems. As we do, we are finding that many systems contain far more debris, left over from their formation, than does our own. This gives us an additional tool by which we can assess potentially habitable planets: based on these kinds of observations, it should be possible to estimate the impact regimes they might experience.
Topological quantum computing (TQC) is a newer type of quantum computing that uses "braids" of particle tracks, rather than actual particles such as ions and electrons, as the qubits to implement computations. Using braids has one important advantage: it makes TQCs practically immune to the small perturbations in the environment that cause decoherence in particle-based qubits and often lead to high error rates.
Ever since TQC was first proposed in 1997, experimentally realizing the appropriate braids has been extremely difficult. For one thing, the braids are formed not by the trajectories of ordinary particles, but by the trajectories of exotic quasiparticles (particle-like excitations) called anyons. Also, movements of the anyons must be non-Abelian, a property similar to the non-commutative property in which changing the order of the anyons' movements changes their final tracks. In most proposals of TQC so far, the non-Abelian statistics of the anyons has not been powerful enough, even in theory, for universal TQC.
Now in a new study published in Physical Review Letters, physicists Abolhassan Vaezi at Cornell University and Maissam Barkeshli at Microsoft's research lab Station Q have theoretically shown that anyons tunneling in a double-layer system can transition to an exotic non-Abelian state that contains "Fibonacci" anyons that are powerful enough for universal TQC.
"Our work suggests that some existing experimental setups are rich enough to yield a phase capable of performing 'universal' TQC, i.e., all of the required logical gates for the performance of a quantum computer can be made through the braiding of anyons only," Vaezi told Phys.org. "Since braiding is a topological operation and does not perturb the low-energy physics, the resulting quantum computer is fault-tolerant."
Damage to neural tissue is typically permanent and causes lasting disability in patients, but a new approach has recently been discovered that holds incredible potential to reconstruct neural tissue at high resolution in three dimensions. Research work recently published in the Journal of Neural Engineering demonstrated a method for embedding scaffolding of patterned nanofibers within three-dimensional (3D) hydrogel structures. Neurite outgrowth from neurons in the hydrogel followed the nanofiber scaffolding, tracking directly along the nanofibers, particularly when the nanofibers were coated with a type of cell adhesion molecule called laminin. The coated nanofibers also significantly enhanced the length of growing neurites, and the type of hydrogel significantly affected the extent to which the neurites tracked the nanofibers.
“Neural stem cells hold incredible potential for restoring damaged cells in the nervous system, and 3D reconstruction of neural tissue is essential for replicating the complex anatomical structure and function of the brain and spinal cord,” said Dr. McMurtrey, author of the study and director of the research institute that led this work. “So it was thought that the combination of induced neuronal cells with micropatterned biomaterials might enable unique advantages in 3D cultures, and this research showed that not only can neuronal cells be cultured in 3D conformations, but the direction and pattern of neurite outgrowth can be guided and controlled using relatively simple combinations of structural cues and biochemical signaling factors.”
Since the late 1970s, NASA has been monitoring changes in the Greenland Ice Sheet. Recent analysis of seven years of surface elevation readings from NASA's ICESat satellite and four years of laser and ice-penetrating radar data from NASA's airborne mission Operation IceBridge shows us how the surface elevation of the ice sheet has changed.
Some 130 million light years away, within the constellation Canis Major, two twinkling spiral galaxies are in the process of colliding, treating our eyes to a dazzling show. The pair—NGC 2207 and IC 2163—has hosted the explosive deaths of three stars as supernovas in the last 15 years, and is also spawning stars at an intense rate. But scientists have become particularly interested in these merging galaxies for another reason: They are home to one of the largest known collections of super bright X-ray objects. These so-called “ultraluminous X-ray sources” have been spotted using NASA’s Chandra X-ray Observatory, which is partly responsible for the lustrous composite image shown above.
Just like the Milky Way, these galaxies are teeming with bright sources of X-rays called X-ray binaries. These are systems in which a normal star closely orbits a collapsed star, such as a neutron star or black hole. Strong gravitational forces from the compact star draw material from the normal star, a process known as accretion. As this material falls toward the compact star, it is heated to millions of degrees, generating a huge amount of X-rays. While intense, the emission from these systems pales in comparison to that from ultraluminous X-ray sources (ULXs).
As the name suggests, ULXs are exceedingly bright; they emit more radiation in the X-rays than a million suns would at all wavelengths. Although they are not well understood, many believe they could be black holes of approximately 10 solar masses that are collecting, or accreting, material onto a disk, emitting X-rays in an intense beam. ULXs are also extremely rare: our galaxy doesn't have one, most galaxies don't, and those that do usually host only one. Between NGC 2207 and IC 2163, however, there are 28.
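Taking "a million suns" literally gives a feel for the brightness threshold involved (a back-of-envelope sketch using the standard solar luminosity; the threshold figure is illustrative and not taken from the article):

```python
# Rough check of the ULX brightness criterion described above.
# L_sun is the standard solar bolometric luminosity; a million times it
# lands near the conventional ULX threshold of roughly 1e39 erg/s.
L_SUN_ERG_PER_S = 3.8e33           # solar luminosity, erg/s
ulx_threshold = 1e6 * L_SUN_ERG_PER_S

print(f"ULX X-ray luminosity threshold ~ {ulx_threshold:.1e} erg/s")
```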
Nearly 269,000 tons of plastic pollution may be floating in the world's oceans, according to a new study. Microplastic pollution is found in varying concentrations throughout the oceans, but estimates of the global abundance and weight of floating plastics, both micro and macroplastic, lack sufficient data to support them. To better estimate the total number of plastic particles and their weight floating in the world's oceans, scientists from six countries contributed data from 24 expeditions collected over a six-year period from 2007-2013 across all five sub-tropical gyres, coastal Australia, Bay of Bengal, and the Mediterranean Sea.
The data included information about microplastics collected using nets and large plastic debris from visual surveys, which were then used to calibrate an ocean model of plastic distribution.
Based on the data and model, the authors of the study estimate a minimum of 5.25 trillion plastic particles weighing nearly 269,000 tons in the world's oceans. Large plastics appear to be abundant near coastlines and degrade into microplastics in the five subtropical gyres, while the smallest microplastics were present in more remote regions, such as the subpolar gyres, a finding the authors did not expect. The distribution of the smallest microplastics in remote regions of the ocean may suggest that gyres act as 'shredders' of large plastic items, breaking them into microplastics that are then ejected across the ocean.
"Our findings show that the garbage patches in the middle of the five subtropical gyres are not the final resting places for the world's floating plastic trash. The endgame for micro-plastic is interactions with entire ocean ecosystems," says Marcus Eriksen, PhD, Director of Research for the 5 Gyres Institute.
NASA's Systems Analysis and Concepts Directorate has issued a report outlining a possible way for humans to visit Venus, rather than Mars—by hovering in the atmosphere instead of landing on the surface. The hovering vehicle, which they call a High Altitude Venus Operational Concept (HAVOC), would resemble a blimp with solar panels on top, and would allow people to do research just 50 kilometers above the surface of the planet.
Most everyone knows that NASA wants to send people to Mars—that planet also gets most of the press. Mars is attractive because it looks more like Earth and is relatively close to us. The surface of Venus on the other hand, though slightly closer, is not so attractive, with temperatures that can melt lead and atmospheric pressure 92 times that of Earth. There's also that thick carbon dioxide atmosphere with sulfuric acid clouds, lots of earthquakes, volcanoes going off and terrifying lightning bolts. So, why would anyone rather go to Venus than Mars? Because of far lower radiation and much better solar energy.
No one wants to go to the surface of Venus, at least not anytime soon. Instead, researchers at NASA are looking into the possibility of sending people to hover in the sky above the planet, conducting research in a place far less dangerous than even the surface of Mars. At 50 kilometers up, a HAVOC would experience just one atmosphere of pressure and temperatures averaging just 75 degrees Celsius, with radiation levels equivalent to those in Canada. Astronauts on Mars, on the other hand, would experience 40 times the radiation typically faced back here on Earth, which suggests they'd have to live deep underground to survive, a problem that scientists have not yet solved. Some are even beginning to wonder about the feasibility of sending humans to the surface of Mars.
The mass extinction event was thought to have paved the way for mammals to dominate, but researchers say many of them died out alongside the dinosaurs. During the Cretaceous period, extinct relatives of living marsupials – such as possums and kangaroos – thrived.
An international team of experts on mammal evolution and mass extinctions has shown that the once-abundant animals – known as metatherian mammals – came close to extinction. A 10-km-wide asteroid struck what is now Mexico at the end of the Cretaceous period, unleashing a global cataclysm of environmental destruction which led to the demise of the dinosaurs.
The study, including the University of Edinburgh scientists, shows that two-thirds of all metatherians living in North America also perished. This included more than 90 per cent of species living in the northern Great Plains of the US, which is the best area in the world for finding latest Cretaceous mammal fossils, researchers said.
Metatherians never recovered their previous diversity, which explains why marsupials are rare today and largely restricted to unusual environments in Australia and South America.
Species that give birth to well-developed live young – known as placental mammals – took full advantage of the metatherians’ demise. Placental mammals – which include many species from mice to men – are ubiquitous across the globe today, researchers said.
“This is a new twist on a classic story. It wasn’t only that dinosaurs died out, providing an opportunity for mammals to reign, but that many types of mammals, such as most metatherians, died out too – this allowed advanced placental mammals to rise to dominance,” said Dr Thomas Williamson from the New Mexico Museum of Natural History and Science.
Researchers reviewed the evolutionary history of metatherians and constructed the most up-to-date family tree for the mammals based on the latest fossil records, allowing them to study extinction patterns in unprecedented detail.
Electrons may be seen as small magnets that also carry a negative electrical charge. On a fundamental level, these two properties are indivisible. However, in certain materials where the electrons are constrained in a quasi-one-dimensional world, they appear to split into a magnet and an electrical charge, which can move freely and independently of each other. A longstanding question has been whether or not a similar phenomenon can happen in more than one dimension. A team led by EPFL scientists has now uncovered new evidence showing that this can happen in quasi-two-dimensional magnetic materials. Their work is published in Nature Physics.
A strange phenomenon occurs with electrons in materials that are so thin that they can be thought of as being one-dimensional, e.g. nanowires. Under certain conditions, the electrons in these materials can actually split into an electrical charge and a magnet, which are referred to as "fractional particles". An important but still unresolved question in fundamental particle physics is whether this phenomenon could arise and be observed in more dimensions, like two- or three-dimensional systems.
At temperatures close to absolute zero, electrons bind together to form an exotic liquid that flows with no friction at all. While this was previously observed only at near-absolute-zero temperatures in other materials, this electron liquid can form in cuprates at much higher temperatures, reachable using liquid nitrogen alone. Consequently, there is currently an effort to find new materials that remain superconducting at still higher temperatures, ideally room temperature. But understanding how superconductivity arises on a fundamental level has proven challenging, which limits the development of materials that can be used in applications. The advances brought by the EPFL scientists now lend support to the theory of superconductivity postulated by Anderson.
"This work marks a new level of understanding in one of the most fundamental models in physics," says Henrik M. Rønnow. "It also lends new support for Anderson's theory of high-temperature superconductivity, which, despite twenty-five years of intense research, remains one of the greatest mysteries in the discovery of modern materials."
It’s the most basic of ways to find out what something does, whether it’s an unmarked circuit breaker or an unidentified gene — flip its switch and see what happens. New remote-control technology may offer biologists a powerful way to do this with cells and genes. A team at Rensselaer Polytechnic Institute and Rockefeller University is developing a system that would make it possible to remotely control biological targets in living animals — rapidly, without wires, implants, or drugs.
In a technical report published today in the journal Nature Medicine, the team describes successfully using electromagnetic waves to turn on insulin production to lower blood sugar in diabetic mice. Their system couples a natural iron-storage particle, ferritin, to an ion channel called TRPV1, such that when the metal particle is exposed to a radio wave or magnetic field, it opens the channel, leading to the activation of an insulin-producing gene. Together, the two proteins act as a nano-machine that can be used to trigger gene expression in cells.
“The use of a radiofrequency-driven magnetic field is a big advance in remote gene expression because it is non-invasive and easily adaptable,” said Jonathan S. Dordick, the Howard P. Isermann Professor of Chemical and Biological Engineering and vice president for research at Rensselaer Polytechnic Institute. “You don’t have to insert anything — no wires, no light systems — the genes are introduced through gene therapy. You could have a wearable device that provides a magnetic field to certain parts of the body and it might be used therapeutically for many diseases, including neurodegenerative diseases. It's limitless at this point.”
Other techniques exist for remotely controlling the activity of cells or the expression of genes in living animals. But these have limitations. Systems that use light as an on/off signal require permanent implants or are only effective close to the skin, and those that rely on drugs can be slow to switch on and off.
The new system, dubbed radiogenetics, uses a signal, in this case low-frequency radio waves or a magnetic field, to activate ferritin particles. They, in turn, prompt the opening of TRPV1, which is situated in the membrane surrounding the cell. Calcium ions then travel through the channel, switching on a synthetic piece of DNA the scientists developed to turn on the production of a downstream gene, which in this study was the insulin gene. In an earlier study, the researchers used only radio waves as the “on” signal, but in the current study, they also tested out a related signal – a magnetic field – that could also activate insulin production. They found it had a similar effect as the radio waves.
A Glasgow-based startup is reducing the cost of access to space by offering "satellite kits" that make it easier for space enthusiasts, high schools and universities alike to build a small but functional satellite for as little as US$6,000 and then, thanks to its very small size, launch it for significantly less than the popular CubeSats.
Building a cheap, working satellite is far from easy. The tiny Kickstarter-funded KickSats, released as a secondary payload during SpaceX’s third ISS resupply mission, ran into a technical problem and failed to deploy in time, while the cheap TubeSats, though an interesting concept, have not seen a single launch to date. And although the more proven CubeSats have had more success, they still aren’t exactly affordable (launching a small 3U CubeSat into low Earth orbit will set you back almost $300,000).
As the name suggests, PocketQubes are "pocket-sized" cube-shaped satellites that measure just 5 cm (1.97 in) per side versus the CubeSat's 10 cm (3.94 in). At one eighth the volume and weight of the typical CubeSat, they are much cheaper to send into orbit (launch is approximately $20,000) but still capable of doing interesting things in low Earth orbit. PocketQubes were first proposed by CubeSat co-inventor Prof. Bob Twiggs of Morehead State University as a way to further reduce launch costs for universities, and, like CubeSats, they are modular and can be stacked together to create larger craft.
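The size and cost arithmetic behind those claims is easy to verify (a minimal sketch; the launch prices are the approximate figures quoted in the article):

```python
# A 5 cm cube has one eighth the volume of a 10 cm cube, which is where the
# "one eighth the volume and weight" claim comes from.
cubesat_side_cm = 10.0
pocketqube_side_cm = 5.0
volume_ratio = (pocketqube_side_cm / cubesat_side_cm) ** 3   # 0.125

# Approximate launch costs quoted in the article (USD).
cubesat_3u_launch = 300_000
pocketqube_launch = 20_000
cost_ratio = cubesat_3u_launch / pocketqube_launch           # 15x cheaper

print(f"volume ratio: {volume_ratio}, launch cost ratio: {cost_ratio:.0f}x")
```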
Thanks to a successful launch on a Russian Dnepr-1 rocket in November last year, there are already four PocketQubes currently in orbit, including the still-operational $50SAT, which cost less than $250 in parts and was developed with the help of Prof. Twiggs himself.
But for those who lack the know-how, building a first satellite – even a tiny one – is bound to be an exceedingly complicated and expensive affair. Glasgow-based startup PocketQube Shop, which sells components for the picosatellites, aims to fill that gap with the introduction of a "PocketQube Kit" that contains the main building blocks for any small-budget satellite project.
The kit includes a spacecraft frame, a radio board, an on-board computer and a programmable Labsat development board that can be used to test different electronic boards. Moreover, the kit can interface with third party payloads. With a low (in spacecraft terms) starting price of $5,999 (around £4,000), the kit is targeted at high schools, university students and hobbyists alike.
It will take about 11 trillion gallons of water (42 cubic kilometers) -- around 1.5 times the maximum volume of the largest U.S. reservoir -- to recover from California's continuing drought, according to a new analysis of NASA satellite data.
The finding was part of a sobering update on the state's drought made possible by space and airborne measurements and presented by NASA scientists Dec. 16 at the American Geophysical Union meeting in San Francisco. Such data are giving scientists an unprecedented ability to identify key features of droughts, and can be used to inform water management decisions.
A team of scientists led by Jay Famiglietti of NASA's Jet Propulsion Laboratory in Pasadena, California, used data from NASA's Gravity Recovery and Climate Experiment (GRACE) satellites to develop the first-ever calculation of this kind -- the volume of water required to end an episode of drought.
Earlier this year, at the peak of California's current three-year drought, the team found that water storage in the state's Sacramento and San Joaquin river basins was 11 trillion gallons below normal seasonal levels. Data collected since the launch of GRACE in 2002 show this deficit has increased steadily.
"Spaceborne and airborne measurements of Earth's changing shape, surface height and gravity field now allow us to measure and analyze key features of droughts better than ever before, including determining precisely when they begin and end and what their magnitude is at any moment in time," Famiglietti said. "That's an incredible advance and something that would be impossible using only ground-based observations."
GRACE data reveal that, since 2011, the Sacramento and San Joaquin river basins decreased in volume by four trillion gallons of water each year (15 cubic kilometers). That's more water than California's 38 million residents use each year for domestic and municipal purposes. About two-thirds of the loss is due to depletion of groundwater beneath California's Central Valley.
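The unit conversions quoted in this item are easy to check. A quick sketch, using the exact definition of the US gallon (3.785411784 liters), reproduces the article's round figures:

```python
# Sanity-check the article's gallon <-> cubic-kilometer conversions.
# One US gallon is defined as exactly 3.785411784 liters.

GALLON_M3 = 3.785411784e-3   # cubic meters per US gallon
KM3_M3 = 1e9                 # cubic meters per cubic kilometer

def gallons_to_km3(gallons: float) -> float:
    """Convert US gallons to cubic kilometers."""
    return gallons * GALLON_M3 / KM3_M3

# 11 trillion gallons -- the stated drought-recovery deficit
deficit_km3 = gallons_to_km3(11e12)
print(f"deficit: {deficit_km3:.1f} km^3")   # ~41.6, matching the quoted ~42

# 4 trillion gallons per year -- the stated basin loss rate
loss_km3 = gallons_to_km3(4e12)
print(f"loss rate: {loss_km3:.1f} km^3/yr") # ~15.1, matching the quoted 15
```

Both results agree with the rounded values given in the text.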
In related results, early 2014 data from NASA's Airborne Snow Observatory indicate that snowpack in California's Sierra Nevada range was only half of previous estimates. The observatory is providing the first-ever high-resolution observations of the water volume of snow in the Tuolumne River, Merced, Kings and Lakes basins of the Sierra Nevada and the Uncompahgre watershed in the Upper Colorado River Basin.
Researchers from North Carolina State University have developed a new lithography technique that uses nanoscale spheres to create three-dimensional (3-D) structures with biomedical, electronic and photonic applications. The new technique is significantly less expensive than conventional methods and does not rely on stacking two-dimensional (2-D) patterns to create 3-D structures.
“Our approach reduces the cost of nanolithography to the point where it could be done in your garage,” says Dr. Chih-Hao Chang, an assistant professor of mechanical and aerospace engineering at NC State and senior author of a paper on the work.
Most conventional lithography uses a variety of techniques to focus light on a photosensitive film to create 2-D patterns. These techniques rely on specialized lenses, electron beams or lasers – all of which are extremely expensive. Other conventional techniques use mechanical probes, which are also costly. To create 3-D structures, the 2-D patterns are essentially printed on top of each other. The NC State researchers took a different approach, placing nanoscale polystyrene spheres on the surface of the photosensitive film.
The nanospheres are transparent, but bend and scatter the light that passes through them in predictable ways according to the angle that the light takes when it hits the nanosphere. The researchers control the nanolithography by altering the size of the nanosphere, the duration of light exposures, and the angle, wavelength and polarization of light. The researchers can also use one beam of light, or multiple beams of light, allowing them to create a wide variety of nanostructure designs.
“We are using the nanosphere to shape the pattern of light, which gives us the ability to shape the resulting nanostructure in three dimensions without using the expensive equipment required by conventional techniques,” Chang says. “And it allows us to create 3-D structures all at once, without having to make layer after layer of 2-D patterns.”
The researchers have also shown that they can get the nanospheres to self-assemble in a regularly-spaced array, which in turn can be used to create a uniform pattern of 3-D nanostructures.
“This could be used to create an array of nanoneedles for use in drug delivery or other applications,” says Xu Zhang, a Ph.D. student in Chang’s lab and lead author of the paper.
For decades, the mantra of electronics has been smaller, faster, cheaper. Today, Stanford engineers add a fourth word - taller. At a conference in San Francisco, a Stanford team will reveal how to build high-rise chips that could leapfrog the performance of the single-story logic and memory chips on today's circuit cards.
Those circuit cards are like busy cities in which logic chips compute and memory chips store data. But when the computer gets busy, the wires connecting logic and memory can get jammed. The Stanford approach would end these jams by building layers of logic atop layers of memory to create a tightly interconnected high-rise chip. Many thousands of nanoscale electronic "elevators" would move data between the layers much faster, using less electricity, than the bottleneck-prone wires connecting single-story logic and memory chips today.
The work is led by Subhasish Mitra, a Stanford professor of electrical engineering and computer science, and H.-S. Philip Wong, the Williard R. and Inez Kerr Bell Professor in Stanford's School of Engineering. They describe their new high-rise chip architecture in a paper being presented at the IEEE International Electron Devices Meeting on Dec. 15-17. The researchers' innovation leverages three breakthroughs.
The first is a new technology for creating transistors, those tiny gates that switch electricity on and off to create digital zeroes and ones. The second is a new type of computer memory that lends itself to multi-story fabrication. The third is a technique to build these new logic and memory technologies into high-rise structures in a radically different way than previous efforts to stack chips.
"This research is at an early stage, but our design and fabrication techniques are scalable," Mitra said. "With further development this architecture could lead to computing performance that is much, much greater than anything available today."
Unlike in mathematics, it is rare to have exact solutions to physics problems.
"When they do present themselves, they are an opportunity to test the approximation schemes (algorithms) that are used to make progress in modern physics," said Michael Strickland, Ph.D., associate professor of physics at Kent State University. Strickland and four of his collaborators recently published an exact solution in the journal Physical Review Letters that applies to a wide array of physics contexts and will help researchers to better model galactic structure, supernova explosions and high-energy particle collisions, such as those studied at the Large Hadron Collider at CERN in Switzerland. In these collisions, experimentalists create a short-lived high-temperature plasma of quarks and gluons called quark gluon plasma (QGP), much like what is believed to be the state of the universe microseconds after the Big Bang 13.8 billion years ago.
In their article, Strickland and co-authors Gabriel S. Denicol of McGill University, Ulrich Heinz and Mauricio Martinez of the Ohio State University, and Jorge Noronha of the University of São Paulo presented the first exact solution that describes a system that is expanding at relativistic velocities radially and longitudinally.
The equation that was solved was invented by Austrian physicist Ludwig Boltzmann in 1872 to model the dynamics of fluids and gases. This equation was ahead of its time since Boltzmann imagined that matter was atomic in nature and that the dynamics of the system could be understood solely by analyzing collisional processes between sets of particles.
"In the last decade, there has been a lot of work modeling the evolution of the quark gluon plasma using hydrodynamics in which the QGP is imagined to be fluidlike," Strickland said. "As it turns out, the equations of hydrodynamics can be obtained from the Boltzmann equation and, unlike the hydrodynamical equations, the Boltzmann equation is not limited to the case of a system that is in (or close to) thermal equilibrium.
"Both types of expansion occur in relativistic heavy ion collisions, and one must include both if one hopes to make a realistic description of the dynamics," Strickland continued. "The new exact solution has both types of expansion and can be used to tell us which hydrodynamical framework is the best."
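For reference, the equation in question can be sketched in its relativistic form. The relaxation-time collision term shown here is a standard simplification in quark-gluon-plasma modeling; treating it as the kernel used in the paper is an assumption of this sketch, not a detail given in the article:

```latex
% Relativistic Boltzmann equation for the one-particle
% phase-space distribution f(x, p):
\[
  p^{\mu}\,\partial_{\mu} f(x,p) = \mathcal{C}[f]
\]
% Relaxation-time approximation (RTA): f relaxes toward the local
% equilibrium distribution f_eq on a timescale tau_eq, with u^mu
% the fluid four-velocity.
\[
  \mathcal{C}[f] = -\,\frac{p_{\mu}u^{\mu}}{\tau_{\mathrm{eq}}}
  \left( f - f_{\mathrm{eq}} \right)
\]
```

The hydrodynamical equations mentioned above follow from taking momentum moments of this equation, which is why an exact solution of the full kinetic equation can be used to benchmark them.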
The abstract for this article can be found at journals.aps.org/prl/abstract/… ysRevLett.113.202301.
The International Ocean Discovery Program (IODP) found microbes living 2,400m beneath the seabed off Japan. The tiny, single-celled organisms survive in this harsh environment on a low-calorie diet of hydrocarbon compounds and have a very slow metabolism. The findings are being presented at the American Geophysical Union Fall Meeting.
Elizabeth Trembath-Reichert, from the California Institute of Technology, who is part of the team that carried out the research, said: "We keep looking for life, and we keep finding it, and it keeps surprising us as to what it appears to be capable of." The IODP Expedition 337 took place in 2012 off the coast of Japan’s Shimokita Peninsula in the northwestern Pacific. From the Chikyu ship, a monster drill was set down more than 1,000m (3,000ft) beneath the waves, where it penetrated a record-breaking 2,446m (8,024ft) of rock under the seafloor. Samples were taken from the ancient coal bed system that lies at this depth, and were returned to the ship for analysis.
The team found that microbes, despite having no light, no oxygen, barely any water and very limited nutrients, thrived in the cores. To find out more about how this life from the "deep biosphere" survives, the researchers set up a series of experiments in which they fed the little, spherical organisms different compounds. Dr Trembath-Reichert said: "We chose these coal beds because we knew there was carbon, and we knew that this carbon was about as tasty to eat, when it comes to coal, as you could get for microbes. "The thought was that while there are some microbes that can eat compounds in coal directly, there may be smaller organic compounds – methane and other types of hydrocarbons - sourced from the coal that the microbes could eat as well."
The experiments revealed that the microbes were indeed dining on these methyl compounds. The tests also showed that the organisms lived life in the slow lane, with an extremely sluggish metabolism.
MIT researchers have discovered a new mathematical relationship — between material thickness, temperature, and electrical resistance — that appears to hold in all superconductors. They describe their findings in the latest issue of Physical Review B.
“We were able to use this knowledge to make larger-area devices, which were not really possible to do previously, and the yield of the devices increased significantly,” says Yachin Ivry, a postdoc in MIT’s Research Laboratory of Electronics, and the first author on the paper. Ivry works in the Quantum Nanostructures and Nanofabrication Group, which is led by Karl Berggren, a professor of electrical engineering and one of Ivry’s co-authors on the paper. Among other things, the group studies thin films of superconductors.
Superconductors are materials that, at temperatures near absolute zero, exhibit no electrical resistance; this means that it takes very little energy to induce an electrical current in them. A single photon will do the trick, which is why they’re useful as quantum photodetectors. And a computer chip built from superconducting circuits would, in principle, consume about one-hundredth as much energy as a conventional chip.
“Thin films are interesting scientifically because they allow you to get closer to what we call the superconducting-to-insulating transition,” Ivry says. “Superconductivity is a phenomenon that relies on the collective behavior of the electrons. So if you go to smaller and smaller dimensions, you get to the onset of the collective behavior.”
By 2040, the world’s energy supply mix will be divided into nearly four equal parts: oil, gas, coal and low-carbon sources (nuclear and renewables), according to the International Energy Agency’s (IEA) 2014 World Energy Outlook. The assessment by the IEA finds that under current and planned policies, the average global temperature will increase by 3.6 degrees Celsius by 2100. Renewable energy takes a far greater role in new electricity supply in the near future, expanding from about 1,700 gigawatts today to 4,550 gigawatts in 2040, but that is not enough to offset the global dominance of fossil fuels.
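The quoted capacity figures imply a roughly 2.7-fold expansion of renewables. A quick sketch of the implied annual growth rate, assuming "today" means 2014 (the report's publication year, an assumption for this calculation):

```python
# Implied growth from the IEA figures: renewables expanding from
# about 1,700 GW "today" to 4,550 GW in 2040. "Today" is taken
# here as 2014, the World Energy Outlook's publication year.

start_gw, end_gw = 1700.0, 4550.0
years = 2040 - 2014

growth_factor = end_gw / start_gw              # overall expansion
cagr = (end_gw / start_gw) ** (1 / years) - 1  # compound annual growth rate

print(f"overall growth: {growth_factor:.2f}x")  # ~2.68x
print(f"implied CAGR: {cagr:.1%} per year")     # ~3.9% per year
```

A compound growth rate of about 4 percent per year, sustained for a quarter century, is substantial but, as the report notes, still not enough to displace fossil fuels from the overall mix.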
“As our global energy system grows and transforms, signs of stress continue to emerge,” IEA Executive Director Maria van der Hoeven, said in a statement. “But renewables are expected to go from strength to strength, and it is incredible that we can now see a point where they become the world’s number one source of electricity generation.”
Renewable energy production will double as a share of world electricity demand by 2040, according to the report. But that still does not dethrone coal in electricity generation: coal use will simply shift regionally from the United States and China to Southeast Asia and India, according to the IEA.
The least glamorous piece of the puzzle, energy efficiency, is poised to be a winner in the coming decades and could have an even greater impact if some of the world’s largest energy users follow through with proposed efficiency plans. Efficiency measures are set to halve the global growth in energy demand, from 2 percent annually to about 1 percent, beginning in 2025, according to the IEA.
Efficiency standards for cars and more stringent energy efficiency targets for industry and everyday devices are key to slowing the demand for energy, but they do not necessarily help diminish the world’s reliance on fossil fuels, because the true price of fossil fuels is not accurately reflected in the price people pay in some regions.
Fossil fuels received about $550 billion in subsidies in 2013, compared to $120 billion for all renewable energies. Although fossil fuel subsidies were $25 billion lower than in 2012, there is still vast room for improvement in ending price breaks for these mature industries, especially in oil- and gas-rich nations, which account for the bulk of the subsidies.
When early humans discovered how to purposefully create fire and make the most of it for their survival, it was a feat comparable to such modern milestones as sending men to the moon. But while the mastery of fire is hailed as among the most crucial developments in human history and evolution, researchers are not certain when it happened.
Some anthropologists believe that early humans started to exploit fire as early as 1.5 million years ago, but much of the evidence supporting this claim such as the heated clays and charcoal fragments is disputed because they can be attributed to natural bush fires. Many experts also think that the early uses of fire were opportunistic, meaning early humans used natural bush fires instead of starting the fire themselves.
A group of archeologists studying artifacts from an ancient cave, however, claims to have figured out when humans learned to master fire. For their study, published in the journal Science on Oct. 19, Ron Shimelmitz, from the Zinman Institute of Archaeology at the University of Haifa in Israel, and colleagues examined artifacts, most of them flint tools and debris, excavated from Israel's Tabun Cave. The archeological site, which UNESCO declared to be of universal value two years ago, documents half a million years of human history and gave the researchers the opportunity to study how the use of fire evolved in the cave.
By examining the cave's sediment layers, the researchers found that in layers older than 350,000 years, most of the flints were not burned. Burned flints, however, started to show up regularly after that point, most of them exhibiting cracking, red or black coloration, and small round depressions where fragments called pot lids flaked off the stone, all indicating exposure to fire.
Shimelmitz and colleagues said that while fire had been in use for a long time, it took a while before humans learned to control and start it; their study indicates that habitual use of fire in Israel's Tabun Cave began between 350,000 and 320,000 years ago. "While hominins may have used fire occasionally, perhaps opportunistically, for some million years, we argue here that it only became a consistent element in behavioral adaptations during the second part of the Middle Pleistocene," the researchers wrote.