NOTE: All articles in the amazing-science newsletter can also be sorted by topic. To do so, click the FIND button (symbolized by the FUNNEL at the top right of the screen) and display all the relevant postings SORTED by TOPICS.
You can also type your own query:
e.g., if you are looking for articles involving "dna" as a keyword
In a tiny quantum prison, electrons behave quite differently from their counterparts in free space. They can only occupy discrete energy levels, much like the electrons in an atom -- for this reason, such electron prisons are often called "artificial atoms." Artificial atoms may also feature properties beyond those of conventional ones, with potential applications in, for example, quantum computing. Such additional properties have now been shown for artificial atoms in the carbon material graphene. The results have been published in the journal Nano Letters; the project was a collaboration of scientists from TU Wien (Vienna, Austria), RWTH Aachen (Germany) and the University of Manchester (UK).
"Artificial atoms open up new, exciting possibilities, because we can directly tune their properties," says Professor Joachim Burgdörfer (TU Wien, Vienna). In semiconductor materials such as gallium arsenide, trapping electrons in tiny confinements has already been shown to be possible. These structures are often referred to as "quantum dots." Just like in an atom, where the electrons can only circle the nucleus on certain orbits, electrons in these quantum dots are forced into discrete quantum states.
Even more interesting possibilities are opened up by using graphene, a material consisting of a single layer of carbon atoms, which has attracted a lot of attention in the last few years.
"In most materials, electrons may occupy two different quantum states at a given energy. The high symmetry of the graphene lattice allows for four different quantum states. This opens up new pathways for quantum information processing and storage," explains Florian Libisch from TU Wien. However, creating well-controlled artificial atoms in graphene turned out to be extremely challenging.
Hydrogen is the first chemical element listed in the periodic table, with atomic number 1. Until now, hydrogen has usually been encountered as a gas or liquid, but scientists are extremely close to producing the first ever sample of solid metallic hydrogen. Their progress so far has been achieved through the use of powerful lasers, electrical impulses, and other state-of-the-art equipment.
As research continues, scientists are uncovering more about the properties of hydrogen and how it can benefit us. It is a simple element that changes phase with temperature and pressure, and it could be the next big superconductor. The superconductors currently used in MRI machines and the Large Hadron Collider only work when cooled to extremely low temperatures; metallic hydrogen, by contrast, has the potential to act as a superconductor at room temperature.
Various techniques and experiments are currently being used in the race to be the first to produce metallic hydrogen. One approach being trialed involves a diamond anvil, in which two tapered diamonds exert intense pressure on a sample of hydrogen; so far, however, this has achieved the fourth phase of solid hydrogen, not metallic hydrogen, so more work is still needed here. Others have begun to use lasers to blast samples of hydrogen, temporarily increasing the pressure and temperature.
Although all of these experiments showed some evidence of metallic behavior, the product was only liquid metal. Other techniques use intense bursts of electrical power, such as the Z Machine at Sandia National Laboratories, to force a metal plate into the hydrogen samples. Scientists are confident that metallic hydrogen exists in the solar system; it is now just a case of reproducing it.
The effort will use next-generation cell-culture methods and fresh patient samples.
An international collaboration of cancer-research heavyweights aims to grow 1,000 new cell lines for scientists to study — and that could be just the beginning. The Human Cancer Models Initiative announced its pilot project on July 11 and intends to complete the initial 1,000 models within 3 years. Members of the initiative include the US National Cancer Institute (NCI) in Bethesda, Maryland; Cancer Research UK in London; the Wellcome Trust Sanger Institute in Hinxton, UK; and Hubrecht Organoid Technology of Utrecht in the Netherlands.
The initial goal of 1,000 cell lines would roughly double the world’s collection of accessible cancer cell models, says Louis Staudt, head of the NCI’s Center for Cancer Genomics. But if all goes well during the pilot, the project will generate thousands more. Staudt estimates that researchers need about 10,000 models to fully capture the diversity of relatively common genetic subtypes of cancer. “Whether we actually will push into that depends a lot upon how easy and valuable the cell lines are from the pilot,” he says.
The initiative’s models will offer several improvements over most available cell lines. Each line will be matched with clinical data about the donor patient, and how they responded to treatment. The project will also use cutting-edge techniques to generate its models, which will include 3D cultures called organoids, and cells that have been reprogrammed to grow indefinitely in culture.
The hope is that these features will better reflect human cancers, enabling the cells to be used to model disease, screen for new drugs and determine which treatments are suited for which cancers.
A new concept could bring highly efficient solar power by combining three types of technologies that convert different parts of the light spectrum and also store energy for use after sundown.
Combining the technologies could make it possible to harness and store far more of the spectrum of sunlight than is possible using any one of the technologies separately.
"Harvesting the full spectrum of sunlight using a hybrid approach offers the potential for higher efficiencies, lower power production costs, and increased power grid compatibility than any single technology by itself," said Peter Bermel, an assistant professor in Purdue University's School of Electrical and Computer Engineering. "The idea is to use technologies that, for the most part exist now, but to combine them in a creative way that allows us to get higher efficiencies than we normally would."
The approach combines solar photovoltaic cells, which convert visible and ultraviolet light into electricity, thermoelectric devices that convert heat into electricity, and steam turbines to generate electricity. The thermoelectric devices and steam turbines would be driven by heat collected and stored using mirrors to focus sunlight onto a newly designed "selective solar absorber and reflector."
"This is a spectrally selective system, so it is able to efficiently make use of as much of the spectrum as possible," he said. "The thermal storage allows for significant flexibility in the time of power generation, so the system can produce power for hours after sunset, providing a consistent source of power throughout the day."
Findings from the research are detailed in a paper with an advance online publication date of Aug. 15, and the paper is scheduled to appear in a future print issue of the journal Energy & Environmental Science.
A new class of substances is effective against both the AIDS pathogen, HIV, and antibiotic-resistant MRSA bacteria. These two pathogens often occur together. Scientists hope that it may be possible to control them with a single drug in the future. Scientists of the Helmholtz Institute for Pharmaceutical Research Saarland (HIPS) developed so-called dual agents that inhibit the growth of both types of pathogens. They describe their findings in the renowned Journal of Medicinal Chemistry. The HIPS is the Saarbrücken branch of the Helmholtz Centre for Infection Research (HZI), which has its headquarters in Braunschweig. It was founded jointly by the HZI and Saarland University in 2009.
The human immunodeficiency virus HIV is one of the most dangerous and widespread pathogens in the world. Some 37 million people are host to the virus, and 1.2 million were killed by this disease in 2014 alone. Both the proliferation of the pathogen and the progression of the disease can now be halted through a combination therapy, but the viruses show an increasing tendency to develop resistance and no longer respond to the medications used against them.
The notorious MRSA bacteria, i.e. methicillin-resistant Staphylococcus aureus strains, show similar persistence, as many common antibiotics have become ineffective against them. HIV patients, whose immune system has already been weakened by the disease, are often additionally afflicted by MRSA pathogens. These co-infections are very problematic and difficult to treat. "Resistance to the common therapies is quite widespread amongst both the viruses and the MRSA bacteria, which means that the co-infection is very difficult to control," explains HZI scientist Prof Rolf Hartmann, who is the head of the "Drug Design and Optimization" department at the HIPS. "In addition, it is necessary to carefully consider the interactions between the medications given to the patients."
Development of tools to identify species, genotypes, or novel strains of invasive organisms is critical for monitoring emergence and implementing rapid response measures. Molecular markers, although critical to identifying species or genotypes, require bioinformatic tools for analysis. However, user-friendly analytical tools for fast identification are not readily available.
To address this need, we created a web-based set of applications called Microbe-ID that allow for customizing a toolbox for rapid species identification and strain genotyping using any genetic markers of choice. Two components of Microbe-ID, named Sequence-ID and Genotype-ID, implement species and genotype identification, respectively. Sequence-ID allows identification of species by using BLAST to query sequences for any locus of interest against a custom reference sequence database.
Genotype-ID allows placement of an unknown multilocus marker in either a minimum spanning network or dendrogram with bootstrap support from a user-created reference database.
Microbe-ID can be used for identification of any organism based on nucleotide sequences or any molecular marker type and several examples are provided. We created a public website for demonstration purposes called Microbe-ID (microbe-id.org) and provided a working implementation for the genus Phytophthora (phytophthora-id.org).
In Phytophthora-ID, the Sequence-ID application allows identification based on ITS or cox spacer sequences. Genotype-ID groups individuals into clonal lineages based on simple sequence repeat (SSR) markers for the two invasive plant pathogen species P. infestans and P. ramorum. All code is open source and available on GitHub and CRAN. Instructions for installation and use are provided at https://github.com/grunwaldlab/Microbe-ID.
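The Sequence-ID workflow described above (query a sequence against a reference database and report the best hit) can be sketched in miniature. The snippet below is illustrative only: Microbe-ID itself uses BLAST and R, while the naive per-position identity score, the toy sequences, and the function names here are invented for demonstration.

```python
# Toy sketch of sequence-based species identification, in the spirit of
# Sequence-ID: score a query against each reference sequence and report
# the best match. Real tools use BLAST alignments, not position-by-position
# comparison of equal-length strings.

def identity(a, b):
    """Fraction of matching positions between two sequences."""
    matches = sum(1 for x, y in zip(a, b) if x == y)
    return matches / max(len(a), len(b))

def identify(query, reference_db):
    """Return (species, score) for the best-scoring reference entry."""
    best_species = max(reference_db, key=lambda sp: identity(query, reference_db[sp]))
    return best_species, identity(query, reference_db[best_species])

# Invented reference sequences for two Phytophthora species.
reference_db = {
    "P. infestans": "ATCGGCTAAGGCTT",
    "P. ramorum":   "ATCGACTTAGGCAT",
}

species, score = identify("ATCGGCTAAGGCAT", reference_db)
print(species, round(score, 2))  # best hit: P. infestans, identity 0.93
```

A real implementation would replace `identity` with a BLAST search and report an e-value alongside the percent identity, as Sequence-ID does.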
Researchers at the U.S. Department of Energy's (DOE) Ames Laboratory have discovered an unusual property of purple bronze that may point to new ways to achieve high temperature superconductivity. While studying purple bronze, a molybdenum oxide, researchers discovered an unconventional charge density wave on its surface.
A charge density wave (CDW) is a state of matter in which electrons bunch together in a repeating pattern, like a standing wave on the surface of water. Superconductivity and charge density waves share a common origin, often co-exist, and can compete for dominance in certain materials.
Conventional CDWs and superconductivity both arise from electron-phonon interactions, the interaction of electrons with the vibrations of the crystal lattice. Electron-electron interactions are the likely origin of unconventional, high-temperature superconductivity such as found in copper- and iron-based compounds.
Unconventional, electron-electron driven CDWs are extremely rare, and the discovery here is important because the material showed an 'extraordinary' increase of the CDW transition temperature from 130 K (-143 °C) to 220 K (-53 °C) and a huge increase of the energy gap at the surface.
Both are properties essential for CDW and high-temperature superconductivity, explained Adam Kaminski, Ames Laboratory scientist and professor in the Department of Physics and Astronomy at Iowa State University.
"This was an accidental but very exciting discovery," said Kaminski. "We were studying this material because its one-dimensional structure makes it quite interesting. We saw strange things happening to the electronic band structure, but when we looked at the surface we were stunned by extraordinary enhancement of transition temperature and energy gap."
The world’s largest cloning plant is expected to clone 1 million cattle annually and is currently under construction in China.
Imagine one million identical cows marching shoulder to shoulder and rib to rib down a path to the slaughterhouse. Imagine one million identical cows getting the same idea simultaneously to turn around and storm the cloning plant that created them. Somewhere in between is what probably will happen when the world’s largest cloning plant, currently under construction in China, goes into full operation in early 2016.
Plant where 1 million cows will be cloned annually
The $31 million plant is being built in Tianjin (160 km (100 miles) from Beijing) by BoyaLife, a three-year-old biotech firm specializing in stem cell and regenerative medicine, biological products, drug innovation and hereditary diseases research. The plan for the 14,000 square meter (150,000 sq. ft.) plant is to produce 100,000 cloned cattle embryos the first year and ramp quickly up to a million annually to satisfy China’s rapidly-growing demand for beef.
In addition to cattle, the company will clone pet dogs, police dogs, racehorses and “non-human primates,” with the somewhat altruistic goal of being the first to someday clone endangered pandas.
Researchers at the Faculty of Physics at the University of Warsaw, using liquid crystal elastomer technology originally developed at the LENS Institute in Florence, have demonstrated a bioinspired micro-robot capable of mimicking caterpillar gaits at natural scale. The 15-millimeter-long soft robot harvests energy from green light and is controlled by a spatially modulated laser beam. Apart from travelling on flat surfaces, it can also climb slopes, squeeze through narrow slits and transport loads.
For decades scientists and engineers have been trying to build robots that mimic the different modes of locomotion found in nature. Most of these designs have rigid skeletons and joints driven by electric or pneumatic actuators. In nature, however, a vast number of creatures navigate their habitats using soft bodies: earthworms, snails and larval insects can effectively move in complex environments using different strategies. To date, attempts to create soft robots have been limited to larger scales (typically tens of centimeters), mainly due to difficulties in power management and remote control.
Liquid Crystalline Elastomers (LCEs) are smart materials that can exhibit large shape change under illumination with visible light. With the recently developed techniques, it is possible to pattern these soft materials into arbitrary three dimensional forms with a pre-defined actuation performance. The light-induced deformation allows a monolithic LCE structure to perform complex actions without numerous discrete actuators.
Researchers from the University of Warsaw with colleagues from LENS (Italy) and Cambridge (UK) have now developed a natural-scale soft caterpillar robot with an opto-mechanical liquid crystalline elastomer monolithic design. The robot body is made of a light-sensitive elastomer stripe with patterned molecular alignment. By controlling the travelling deformation pattern, the robot mimics different gaits of its natural relatives. It can also walk up a slope, squeeze through a slit and push objects as heavy as ten times its own mass, demonstrating its ability to perform in challenging environments and pointing to potential future applications.
“The Arctic sea ice responded very rapidly to past climate changes. During the coldest periods of the past 90,000 years the sea ice edge spread relatively quickly to the Greenland-Scotland Ridge, and probably far into the Atlantic Ocean,” says Ulrike Hoff, a researcher at the Centre for Arctic Gas Hydrate, Environment and Climate (CAGE).
Sea ice amplifies the climate changes that are occurring at any given time. Its growth and melting has profound effects on climate, the marine environment and ocean circulation.
Hoff and colleagues studied the past distribution of sea ice in the longest sea-ice record yet obtained from a marine sediment core. The core was retrieved from 1,200 m water depth on the ocean floor of the Nordic Seas, just off the Faroe Islands. It represents 90,000 years of sediment layers, and it is by studying those layers that scientists can reveal changes in sea ice and past climate.
It was the tiniest of evidence in these layers that brought this strong confirmation of sea-ice behavior to light: a type of phytoplankton called diatoms, which are everywhere around you. Diatoms are single-celled algae with a cell wall made of silica.
“They are the golden brown coating on the glass of a street lamp, and the shiny stuff in your make-up. They are even used in toothpaste as a cleaning agent,” says Hoff. “Diatoms are truly amazing, and can be preserved in marine and lake sediments for millions of years. I have personally examined diatom fossils that are 65 million years old, and they look much the same as the diatoms we find living today.”
In the future, level-tuned neurons may help enable neuromorphic computing systems to perform tasks that traditional computers cannot, such as learning from their environment, pattern recognition, and knowledge extraction from big data sources.
The researchers, Angeliki Pantazi et al., at IBM Research-Zurich and École Polytechnique Fédérale de Lausanne, both in Switzerland, have published a paper on the new neuromorphic architecture in a recent issue of Nanotechnology.
Like all neuromorphic computing architectures, the proposed system is based on neurons and their synapses, which are the junctions where neurons send signals to each other. In this study, the researchers physically implemented artificial neurons using phase-change materials. These materials have two stable states: a crystalline, low-resistivity state and an amorphous, high-resistivity state. Just as in traditional computing, the states can be switched by the application of a voltage. When the neuron's conductance reaches a certain threshold, the neuron fires.
"We have demonstrated that phase-change-based memristive devices can be used to create artificial neurons and synapses to store and process data," coauthor Evangelos Eleftheriou at IBM Research-Zurich explains. "A phase-change neuron uses the phase configuration of the phase-change material to represent its internal state, the membrane potential. For the phase-change synapse, the synaptic weight—which is responsible for the plasticity—is encoded by the conductance of the nanodevice."
In this architecture, each neuron is tuned to a specific range, or level. Neurons receive signals from many other neurons, and a level is defined by the cumulative sum of these incoming signals.
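In outline, the firing behavior described above is an integrate-and-fire cycle: weighted inputs accumulate in the device's internal state until a threshold is crossed, then the neuron fires and resets. The toy model below sketches only that idea; the class name, the numbers, and the reset rule are illustrative and are not taken from the IBM device.

```python
# Minimal integrate-and-fire sketch of a phase-change-style neuron:
# weighted inputs accumulate in an internal state (the "membrane
# potential"); crossing the threshold makes the neuron fire and reset.

class PhaseChangeNeuron:
    def __init__(self, threshold=1.0):
        self.threshold = threshold
        self.state = 0.0  # stands in for the device's phase configuration

    def receive(self, inputs, weights):
        """Accumulate weighted inputs; return True if the neuron fires."""
        self.state += sum(i * w for i, w in zip(inputs, weights))
        if self.state >= self.threshold:
            self.state = 0.0  # reset after firing
            return True
        return False

neuron = PhaseChangeNeuron(threshold=1.0)
weights = [0.3, 0.5]  # synaptic weights (conductances), invented values
print(neuron.receive([1, 1], weights))  # accumulates 0.8: below threshold
print(neuron.receive([1, 0], weights))  # total reaches 1.1: the neuron fires
```

In the hardware, the accumulating state is the phase configuration of the material itself rather than a stored floating-point number, which is what makes the device both memory and processor at once.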
A collaboration of physicists and a mathematician has made a significant step toward unifying general relativity and quantum mechanics by explaining how spacetime emerges from quantum entanglement in a more fundamental theory.
Physicists and mathematicians have long sought a Theory of Everything (ToE) that unifies general relativity and quantum mechanics. General relativity explains gravity and large-scale phenomena such as the dynamics of stars and galaxies in the universe, while quantum mechanics explains microscopic phenomena from the subatomic to molecular scales.
The holographic principle is widely regarded as an essential feature of a successful Theory of Everything. The holographic principle states that gravity in a three-dimensional volume can be described by quantum mechanics on a two-dimensional surface surrounding the volume. In particular, the three dimensions of the volume should emerge from the two dimensions of the surface. However, understanding the precise mechanics for the emergence of the volume from the surface has been elusive.
The paper announcing the discovery by Hirosi Ooguri, a Principal Investigator at the University of Tokyo's Kavli IPMU, with Caltech mathematician Matilde Marcolli and graduate students Jennifer Lin and Bogdan Stoica, will be published in Physical Review Letters as an Editors' Suggestion "for the potential interest in the results presented and on the success of the paper in communicating its message, in particular to readers from other fields."
Now, Ooguri and his collaborators have found that quantum entanglement is the key to solving this question. Using a quantum theory (that does not include gravity), they showed how to compute energy density, which is a source of gravitational interactions in three dimensions, using quantum entanglement data on the surface. This is analogous to diagnosing conditions inside your body by looking at X-ray images on two-dimensional sheets. This allowed them to interpret universal properties of quantum entanglement as conditions on the energy density that should be satisfied by any consistent quantum theory of gravity, without explicitly including gravity in the theory. The importance of quantum entanglement had been suggested before, but its precise role in the emergence of spacetime was not clear until the new paper by Ooguri and collaborators.
Quantum entanglement is a phenomenon whereby quantum states such as the spin or polarization of particles at different locations cannot be described independently. Measuring (and hence acting on) one particle must also act on the other, something that Einstein called "spooky action at a distance." The work of Ooguri and collaborators shows that this quantum entanglement generates the extra dimensions of the gravitational theory.
Associate Professor Dr Joan Vaccaro, of Griffith's Centre for Quantum Dynamics, has solved an anomaly of conventional physics and shown that a mysterious effect called 'T violation' could be the origin of time evolution and conservation laws.
"I begin by breaking the rules of physics, which is rather bold I have to admit, but I wanted to understand time better and conventional physics can't do that," Dr Vaccaro says.
"I do get conventional physics in the end though. This means that the rules I break are not fundamental. It also means that I can see why the universe has those rules. And I can also see why the universe advances in time."
In her research, published by the Royal Society, Dr Vaccaro says T violation, or a violation of time reversal (T) symmetry, is forcing the universe, and us in it, into the future. "If T violation wasn't involved we wouldn't advance in time and we'd be stuck at the Big Bang, so this shows how we escaped the Big Bang."
"I found the mechanism that forces us to go to the future, the reason why you get old and the reason why we advance in time." "The universe must be symmetric in time and space overall. But we know that there appears to be a preferred direction in time because we are incessantly getting older not younger."
The anomaly Dr Vaccaro solves involves two things not accounted for in conventional physical theories -- the direction of time, and the behavior of mesons, which would decay differently if time ran in the opposite direction.
"Experiments show that the behavior of mesons depends on the direction of time; in particular, if the direction of time were changed then their behavior would also change," she says.
"Conventional physical theories can accommodate only one direction of time and one kind of meson behavior, and so they are asymmetric in this regard. But the problem is that the universe cannot be asymmetric overall.
"This means that physical theories must be symmetric in time. To be symmetric in time they would need to accommodate both directions of time and both meson behaviors. This is the anomaly in physics that I am attempting to solve."
Dr Vaccaro is presenting her work at the Soapbox Science event held in Brisbane as part of National Science Week, titled "The meaning of time: why the universe didn't stay put at the big bang and how it is 'now' and no other time."
Without any T violation the theory gives a very strange universe. An object like a cup can be placed in time just like it is in space.
"It just exists at one place in space and one point in time. There is nothing unusual about being at one place in space, but existing at one point in time means the object would come into existence only at that point in time and then disappear immediately.
"This means that conservation of matter would be violated. It also means that there would be no evolution in time. People would only exist for a single point in time -- they would not experience a "flow of time."
When Dr Vaccaro adds T violation to the theory, things change dramatically. "The cup is now found at any and every time," she says.
"This means that the theory now has conservation of matter -- the conservation has emerged from the theory rather than being assumed. Moreover, objects change over time, cups chip and break, and people grow old and experience a 'flow of time.' This means that the theory now has time evolution."
Scientists from the Institut Pasteur have demonstrated the role of lysosomal vesicles in transporting α-synuclein aggregates, responsible for Parkinson's and other neurodegenerative diseases, between neurons. These proteins move from one neuron to the next in lysosomal vesicles which travel along the "tunneling nanotubes" between cells. These findings were published in The EMBO Journal on Aug. 22, 2016.
Synucleinopathies, a group of neurodegenerative diseases including Parkinson's disease, are characterized by the pathological deposition of aggregates of the misfolded α-synuclein protein into inclusions throughout the central and peripheral nervous system. Intercellular propagation (from one neuron to the next) of α-synuclein aggregates contributes to the progression of the neuropathology, but little was known about the mechanism by which spread occurs.
In this study, scientists from the Membrane Traffic and Pathogenesis Unit, directed by Chiara Zurzolo at the Institut Pasteur, used fluorescence microscopy to demonstrate that pathogenic α-synuclein fibrils travel between neurons in culture, inside lysosomal vesicles through tunneling nanotubes (TNTs), a new mechanism of intercellular communication.
After being transferred via TNTs, α-synuclein fibrils are able to recruit and induce aggregation of the soluble α-synuclein protein in the cytosol of cells receiving the fibrils, thus explaining the propagation of the disease. The scientists propose that cells overloaded with α-synuclein aggregates in lysosomes dispose of this material by hijacking TNT-mediated intercellular trafficking. However, this results in the disease being spread to naive neurons.
This study demonstrates that TNTs play a significant part in the intercellular transfer of α-synuclein fibrils and reveals the specific role of lysosomes in this process. This represents a major breakthrough in understanding the mechanisms underlying the progression of synucleinopathies.
These compelling findings, together with previous reports from the same team, point to the general role of TNTs in the propagation of prion-like proteins in neurodegenerative diseases and identify TNTs as a new therapeutic target to combat the progression of these incurable diseases.
From the most dramatic moment in life – the day of your birth – to first steps, first words, first food, right up to nursery school, most of us can’t remember anything of our first few years. Even after our precious first memory, the recollections tend to be few and far between until well into our childhood. How come?
This gaping hole in the record of our lives has been frustrating parents and baffling psychologists, neuroscientists and linguists for decades. It was a minor obsession of the father of psychotherapy, Sigmund Freud, who coined the phrase ‘infant amnesia’ over 100 years ago.
Probing that mental blank throws up some intriguing questions. Did your earliest memories actually happen, or are they simply made up? Can we remember events without the words to describe them? And might it one day be possible to claim your missing memories back?
UCLA astronomers have made the first accurate measurement of the abundance of oxygen in a distant galaxy. Oxygen, the third-most abundant chemical element in the universe, is created inside stars and released into interstellar gas when stars die. Quantifying the amount of oxygen is key to understanding how matter cycles in and out of galaxies.
This research is published online in the Astrophysical Journal Letters, and is based on data collected at the W. M. Keck Observatory on Mauna Kea, in Hawaii.
"This is by far the most distant galaxy for which the oxygen abundance has actually been measured," said Alice Shapley, a UCLA professor of astronomy, and co-author of the study. "We're looking back in time at this galaxy as it appeared 12 billion years ago."
Knowing the abundance of oxygen in the galaxy called COSMOS-1908 is an important stepping stone toward helping astronomers better understand galaxy evolution and the population of faint, distant galaxies observed when the universe was only a few billion years old, Shapley said.
July 2016 was the warmest July in 136 years of modern record-keeping, according to a monthly analysis of global temperatures by scientists at NASA's Goddard Institute for Space Studies (GISS) in New York.
Because the seasonal temperature cycle peaks in July, this means July 2016 was also warmer than any other month on record. July 2016's temperature was a statistically small 0.1 degrees Celsius warmer than the previous warm Julys of 2015, 2011 and 2009.
“It wasn't by the widest of margins, but July 2016 was the warmest month since modern record keeping began in 1880,” said GISS Director Gavin Schmidt. “It appears almost a certainty that 2016 also will be the warmest year on record.”
The record warm July continued a streak of 10 consecutive months dating back to October 2015 that have set new monthly high-temperature records. Compared to previous years, the warmer global temperatures last month were most pronounced in the northern hemisphere, particularly near the Arctic region.
The monthly analysis by the GISS team is assembled from publicly available data acquired by about 6,300 meteorological stations around the world, ship- and buoy-based instruments measuring sea surface temperature, and Antarctic research stations. The modern global temperature record begins around 1880 because previous observations didn't cover enough of the planet.
Psychedelic pictures of 30 galactic collisions show for the first time that merging galaxies often spawn disc-shaped offspring like our Milky Way.
These images show the carbon monoxide gas detected in neighboring galaxies – 40 to 600 million light years from Earth – in their final stages of merging. The colors show how this gas was moving: blue represents gas that is moving towards us, while red indicates gas that is moving away.
Of the 37 galaxies observed, these 30 all show gas rotating around the centre of the galaxy, meaning they are disc galaxies in the making.
“For the first time there is observational evidence for merging galaxies that could result in disc galaxies. This is a large and unexpected step towards understanding the mystery of the birth of disc galaxies,” says Junko Ueda from the Japan Society for the Promotion of Science. Ueda and her team made the observations using data from the ALMA radio telescope.
The term "life hacking" usually refers to clever tweaks that make your life more productive. But this week in Science, a team of scientists comes a step closer to the literal meaning: hacking the machinery of life itself. They have designed—though not completely assembled—a synthetic Escherichia coli genome that could use a protein-coding scheme different from the one employed by all known life. Requiring a staggering 62,000 DNA changes, the finished genome would be the most complicated genetic engineering feat so far. E. coli running this rewritten genome could become a new workhorse for laboratory experiments and a factory for new industrial chemicals, its creators predict.
Such a large-scale genomic hack once seemed impossible, but no longer, says Peter Carr, a bioengineer at the Massachusetts Institute of Technology Lincoln Laboratory in Lexington who is not involved with the project. "It's not easy, but we can engineer life at profound scales, even something as fundamental as the genetic code."
The genome hacking is underway in the lab of George Church at Harvard University, the DNA-sequencing pioneer who has become the most high-profile, and at times controversial, name in synthetic biology. The work takes advantage of the redundancy of life's genetic code, the language that DNA uses to instruct the cell's protein-synthesizing machinery. To produce proteins, cells "read" DNA's four-letter alphabet in clusters of three called codons. The 64 possible triplets are more than enough to encode the 20 amino acids that exist in nature, as well as the "stop" codons that mark the ends of genes. As a result, the genetic code has multiple codons for the same amino acid: the codons CCC and CCG both encode the amino acid proline, for example.
Church and others hypothesized that redundant codons could be eliminated—by swapping out every CCC for a CCG in every gene, for instance—without harming the cell. The gene that enables CCC to be translated into proline could then be deleted entirely. "There are a number of 'killer apps'" of such a "recoded" cell, says Farren Isaacs, a bioengineer at Yale University, who, with Church and colleagues, showed that a stop codon can be swapped out of E. coli entirely.
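The codon swap described above can be sketched in a few lines of Python. This is a hypothetical illustration, not the Church lab's actual pipeline, and the gene sequence below is invented for the example:

```python
def recode(gene, old_codon="CCC", new_codon="CCG"):
    """Swap a redundant codon for a synonymous one, codon by codon.

    Reads the sequence in non-overlapping triplets, so only true codons
    are replaced, never triplets straddling a codon boundary.
    """
    assert len(gene) % 3 == 0, "coding sequence length must be a multiple of 3"
    codons = [gene[i:i + 3] for i in range(0, len(gene), 3)]
    return "".join(new_codon if c == old_codon else c for c in codons)

# Both CCC and CCG encode proline, so the encoded protein is unchanged.
example = "ATGCCCGGTCCCTAA"  # Met-Pro-Gly-Pro-Stop (made-up gene)
print(recode(example))       # ATGCCGGGTCCGTAA
```

The real project applies tens of thousands of such substitutions across seven codons genome-wide, which is why synthesizing the recoded DNA from scratch proved easier than editing it site by site.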
The cells could be immune to viruses that impair bioreactors, for example, if crucial viral genes include now untranslatable codons. The changes could also allow synthetic biologists to repurpose the freed redundant codons for an entirely different function, such as coding for a new, synthetic amino acid.
For this study, Church's team decided to eliminate seven of the microbe's 64 codons. That target seemed like "a good balance" between the number of changes that appeared technically achievable and the number that might be too many for a cell to survive, says Matthieu Landon, one of Church's Ph.D. students. And the seven spare codons could eventually be repurposed to code up to four different unnatural amino acids.
But making so many changes, even with the latest DNA editing techniques such as CRISPR, still appeared impossible. Luckily, the cost of synthesizing DNA has plummeted over the past decade. So instead of editing the genome one site at a time, Church's team used machines to synthesize long stretches of the recoded genome from scratch, each chunk containing multiple changes.
The team has now turned to the laborious job of inserting these chunks into E. coli one by one and making sure that none of the genomic changes is lethal to the cells. The researchers have only tested 63% of the recoded genes so far, but remarkably few of the changes have caused trouble, they say.
Dr. Mansoor Sheik-Bahae, professor of physics and astronomy, along with his research group, is advancing a technique called optical refrigeration to reach cryogenic temperatures. Essentially, the group is using laser light to chill a special type of crystal, which can then be attached to a device that requires constant and reliable cooling, like infrared detectors on satellites. What sets their technique apart is the temperatures it can reach without having any moving parts.
“Right now, anything that cools other parts of a system has moving parts. Most of the time, there’s liquid running through it that adds vibrations which can impact the precision or resolution of the device,” explained Aram Gragossian, a research assistant in Sheik-Bahae’s lab. “But, when you have optical refrigeration, you can go to low temperatures without any vibrations and without any moving parts, making it convenient for a lot of applications.”
Earlier this year, Sheik-Bahae, along with collaborators at UNM and Los Alamos National Laboratory, reached the lowest temperature ever recorded using an all-solid-state cryocooler – 91 kelvin, or -296° Fahrenheit – a temperature previously reachable only with liquid nitrogen or helium. The research, “Solid-state optical refrigeration to sub-100 Kelvin regime,” was published in Nature's journal Scientific Reports.
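As a quick sanity check on the figures quoted, the kelvin-to-Fahrenheit conversion is a one-liner (a trivial Python sketch):

```python
def kelvin_to_fahrenheit(k):
    # Convert kelvin to Celsius, then Celsius to Fahrenheit.
    return (k - 273.15) * 9 / 5 + 32

print(round(kelvin_to_fahrenheit(91)))  # -296
```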
“Here at UNM, we are the only group in the world that’s been able to cool to cryogenic temperatures with an all-solid-state optical cryocooler,” said Alexander Albrecht, one of the paper’s co-authors and research assistant professor at UNM.
New research from the Harvard-Smithsonian Center for Astrophysics reveals that the Venus-like exoplanet GJ 1132b might possess a thin oxygen atmosphere – but no life, due to its extreme heat.
The distant planet GJ 1132b intrigued astronomers when it was discovered last year. Located just 39 light-years from Earth, it might have an atmosphere despite being baked to a temperature of around 450 degrees Fahrenheit. But would that atmosphere be thick and soupy or thin and wispy? New research suggests the latter is much more likely.
Harvard astronomer Laura Schaefer (Harvard-Smithsonian Center for Astrophysics, or CfA) and her colleagues examined the question of what would happen to GJ 1132b over time if it began with a steamy, water-rich atmosphere.
Orbiting so close to its star, at a distance of just 1.4 million miles, the planet is flooded with ultraviolet or UV light. UV light breaks apart water molecules into hydrogen and oxygen, both of which then can be lost into space. However, since hydrogen is lighter it escapes more readily, while oxygen lingers behind.
“On cooler planets, oxygen could be a sign of alien life and habitability. But on a hot planet like GJ 1132b, it’s a sign of the exact opposite – a planet that’s being baked and sterilized,” said Schaefer.
Since water vapor is a greenhouse gas, the planet would have a strong greenhouse effect, amplifying the star’s already intense heat. As a result, its surface could stay molten for millions of years.
A “magma ocean” would interact with the atmosphere, absorbing some of the oxygen – but how much? Only about one-tenth, according to the model created by Schaefer and her colleagues. Most of the remaining 90 percent streams off into space, though some might linger.
“This planet might be the first time we detect oxygen on a rocky planet outside the solar system,” said co-author Robin Wordsworth (Harvard Paulson School of Engineering and Applied Sciences). If any oxygen does still cling to GJ 1132b, next-generation telescopes like the Giant Magellan Telescope and James Webb Space Telescope may be able to detect and analyze it.
The magma ocean-atmosphere model could help scientists solve the puzzle of how Venus evolved over time. Venus probably began with Earthlike amounts of water, which would have been broken apart by sunlight. Yet it shows few signs of lingering oxygen. The missing oxygen problem continues to baffle astronomers.
Schaefer predicts that their model also will provide insights into other, similar exoplanets. For example, the system TRAPPIST-1 contains three planets that may lie in the habitable zone. Since they are cooler than GJ 1132b, they have a better chance of retaining an atmosphere.
Polish astronomers working in Chile were performing a sky survey aimed at detecting dark matter, but they got much more: a view of a classical nova explosion in progress. As the researchers collected images for the “Optical Gravitational Lensing Experiment,” they discovered they had a timeline capturing the rare event. The results are published in the journal Nature.
Since the 1986 discovery of high-temperature superconductivity in copper-oxide compounds called cuprates, scientists have been trying to understand how these materials can conduct electricity without resistance at temperatures hundreds of degrees above the ultra-chilled temperatures required by conventional superconductors. Finding the mechanism behind this exotic behavior may pave the way for engineering materials that become superconducting at room temperature. Such a capability could enable lossless power grids, more affordable magnetically levitated transit systems, and powerful supercomputers, and change the way energy is produced, transmitted, and used globally.
Now, physicists at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory have an explanation for why the temperature at which cuprates become superconducting is so high. After growing and analyzing thousands of samples of a cuprate known as LSCO for the four elements it contains (lanthanum, strontium, copper, and oxygen), they determined that this "critical" temperature is controlled by the density of electron pairs—the number of electron pairs per unit area. This finding, described in a Nature paper published August 17, challenges the standard theory of superconductivity, which proposes that the critical temperature depends instead on the strength of the electron pairing interaction.
"Solving the enigma of high-temperature superconductivity has been the focus of condensed matter physics for more than 30 years," said Ivan Bozovic, a senior physicist in Brookhaven Lab's Condensed Matter Physics and Materials Science Department who led the study. "Our experimental finding provides a basis for explaining the origin of high-temperature superconductivity in the cuprates—a basis that calls for an entirely new theoretical framework."
According to Bozovic, one of the reasons cuprates have been so difficult to study is because of the precise engineering required to generate perfect crystallographic samples that contain only the high-temperature superconducting phase.
"It is a materials science problem. Cuprates can have up to 50 atoms per unit cell and the elements can form hundreds of different compounds, likely resulting in a mixture of different phases," said Bozovic.
That's why Bozovic and his research team grew their more than 2,500 LSCO samples by using a custom-designed molecular beam epitaxy system that places single atoms onto a substrate, layer by layer. This system is equipped with advanced surface-science tools, such as those for absorption spectroscopy and electron diffraction, that provide real-time information about the surface morphology, thickness, chemical composition, and crystal structure of the resulting thin films.
There are four fundamental forces through which a dark matter particle could interact: the strong force, which binds together the atomic nucleus; the weak force, which governs particle decays such as radioactivity; the electromagnetic force, which mediates interactions between charged particles; and the gravitational force, which governs gravitational attraction. To observe matter in space, we need it to interact via the electromagnetic force, as this involves the release of light or other electromagnetic radiation that a telescope can register.
Here are the five candidates for particles that I think have the best chance.
1. The WIMP
The weakly interacting massive particle, or WIMP, is a hypothetical particle that looks promising. It would be completely different from the type of matter we know and would not interact via the electromagnetic force, which would explain why it is largely invisible in space. Roughly 100,000 WIMPs would pass through every square centimeter of the Earth each second, interacting with surrounding matter only via the weak force and gravity. WIMPs have been the subject of extensive research, especially because theories extending beyond the Standard Model of physics independently predicted that such a particle must exist – a coincidence dubbed the "WIMP miracle".
2. The Axion
Axions are low-mass, slow-moving particles that don't have a charge and only interact weakly with other matter, which makes them difficult – but not impossible – to detect. Only axions of a specific mass could explain the invisible nature of dark matter – if they were any lighter or heavier, we would be able to see them. And if axions do exist, they should be able to decay into a pair of light particles (photons), which means we could detect them by looking for such pairs. Experiments such as the Axion Dark Matter Experiment are currently looking for axions in this way.
3. The MACHO
MACHO stands for "massive astrophysical compact halo object" and was one of the first proposed candidates for dark matter. These objects, including neutron stars and brown and white dwarfs, are composed of ordinary matter. So how could they be invisible? The reason is that they emit very little or no light.
One way to observe them is by monitoring the brightness of distant stars. As light rays bend when they pass close to a massive object, light from a distant source may be focused by a closer object to produce a sudden brightening of the distant object. This effect, known as gravitational lensing, depends on how much matter, both normal and dark, is in a galaxy – we can use it to calculate the amount of matter lurking around. However, we now know it is unlikely that enough of these dark bodies could accumulate to make up the vast amount of dark matter that exists.
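The sudden brightening described here follows the standard point-lens microlensing magnification formula, A(u) = (u² + 2) / (u √(u² + 4)), where u is the lens-source angular separation measured in Einstein radii. A minimal Python sketch (the separations chosen are arbitrary examples):

```python
import math

def magnification(u):
    """Magnification of a background star by a point-mass lens,
    for a lens-source separation u in units of the Einstein radius."""
    return (u * u + 2) / (u * math.sqrt(u * u + 4))

# The closer the alignment, the stronger the brightening:
for u in (1.0, 0.5, 0.1):
    print(f"u = {u}: A = {magnification(u):.2f}")
```

At u = 1 the source brightens by about 34 percent; as u shrinks toward zero the magnification grows without bound, which is what makes even dim MACHOs detectable through their lensing of background stars.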
4. The Kaluza-Klein particle
The Kaluza-Klein theory is built around the existence of an invisible "fifth dimension" curled up in space, in addition to the three spatial dimensions we know (height, width, depth), and time. This theory, a precursor to string theory, predicts the existence of a particle that could be a dark matter particle, which would have the same mass as 550 to 650 protons (these make up the atomic nucleus together with neutrons).
This kind of particle could interact both via electromagnetism and gravity. However, as it is curled up in a dimension we can't see, we wouldn't observe it just by looking at the sky. Luckily, the particle should be easy to look for in experiments, as it should decay into particles we can measure: neutrinos and photons. However, powerful particle accelerators like the Large Hadron Collider have yet to detect it.
5. The Gravitino
Theories combining general relativity and "supersymmetry" predict the existence of a particle called the gravitino. Supersymmetry, a successful theory explaining many observations in physics, states that every "boson" particle – such as the photon (light particle) – has a "superpartner" (the photon's is the photino) with a property called "spin" (a type of angular momentum) that differs by a half-integer. The gravitino would be the superpartner of the hypothetical "graviton", thought to mediate the force of gravitation. In some models of supergravity where the gravitino is very light, it could account for dark matter.
More than one million people have now had their genome, or its protein-coding regions (the exome), sequenced. The hope is that this information can be shared, linked to phenotype — specifically, disease — and used to improve medical care. An obstacle is that only a small fraction of these data are publicly available.
There are challenges in sharing such data sets — the project scientists deserve credit for making this one open access. Its scale offers insight into rare genetic variation across populations. It identifies more than 7.4 million (mostly new) variants at high confidence, and documents rare mutations that independently emerged, providing the first estimate of the frequency of their recurrence. And it finds 3,230 genes that show nearly no cases of loss of function. More than two-thirds have not been linked to disease, which points to how much we have yet to understand.
The study also raises concern about how genetic variants have been linked to rare disease. The average ExAC participant has some 54 variants previously classified as causal for a rare disorder; many show up at an implausibly high frequency, suggesting that they were incorrectly classified. The authors review evidence for 192 variants reported earlier to cause rare Mendelian disorders yet found at a high frequency by ExAC, and find support for pathogenicity for only nine of them. The implications are broad: these variant data already guide diagnoses and treatment (see E. V. Minikel et al. Sci. Transl. Med. 8, 322ra9; 2016 and R. Walsh et al. Genet. Med. http://dx.doi.org/10.1038/gim.2016.90; 2016).
These findings show that researchers and clinicians must carefully evaluate published results on rare genetic disorders. They also demonstrate the need to filter variants seen in sequence data using the ExAC data set and other reference tools — a practice now widely adopted in genomics.
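The filtering practice described above can be sketched as a simple allele-frequency cutoff: a variant seen too often in a reference panel such as ExAC is an implausible cause of a rare Mendelian disorder. The variant names, frequencies, and threshold below are invented for illustration; real pipelines use disease-specific cutoffs:

```python
# Hypothetical candidate variants with their population allele
# frequencies as observed in a reference panel (made-up numbers).
candidate_variants = {
    "variantA": 0.00001,  # very rare: remains a plausible candidate
    "variantB": 0.02,     # on 2% of chromosomes: too common for a rare disease
    "variantC": 0.0004,
}

# Illustrative cutoff: a rare dominant disorder cannot be caused by a
# variant carried by more than ~0.1% of the population.
MAX_PLAUSIBLE_FREQ = 0.001

plausible = {v: f for v, f in candidate_variants.items()
             if f <= MAX_PLAUSIBLE_FREQ}
print(sorted(plausible))  # ['variantA', 'variantC']
```

This is essentially the logic by which ExAC flagged many previously "causal" variants as implausible: their observed frequencies were far higher than the prevalence of the diseases they were supposed to cause.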
The ExAC project plans to grow over the next year to include 120,000 exome and 20,000 whole-genome sequences. It relies on the willingness of large research consortia to cooperate, and highlights the huge value of sharing, aggregation and harmonization of genomic data. This is also true for patient variants — there is a need for databases that provide greater confidence in variant interpretation, such as the US National Center for Biotechnology Information’s ClinVar database.