The microscopic tardigrade—also known as the water bear—is the only animal that can survive the cold, irradiated vacuum of outer space. We talked to leading tardigrade researchers to find out what makes these little guys so amazing.
The ability to break down alcohol likely helped human ancestors make the most of rotting, fermented fruit that fell onto the forest floor, the researchers said. Knowing when this ability developed could therefore help researchers figure out when these human ancestors began moving to life on the ground, as opposed to living mostly in trees, as earlier human ancestors had. "A lot of aspects about the modern human condition — everything from back pain to ingesting too much salt, sugar and fat — goes back to our evolutionary history," said lead study author Matthew Carrigan, a paleogeneticist at Santa Fe College in Gainesville, Florida. "We wanted to understand more about the modern human condition with regards to ethanol," he said, referring to the kind of alcohol found in rotting fruit that's also used in liquor and fuel.
To learn more about how human ancestors evolved the ability to break down alcohol, scientists focused on the genes that code for a group of digestive enzymes called the ADH4 family. ADH4 enzymes are found in the stomach, throat and tongue of primates, and are the first alcohol-metabolizing enzymes to encounter ethanol after it is imbibed. The researchers investigated the ADH4 genes from 28 different mammals, including 17 primates. They collected the sequences of these genes from either genetic databanks or well-preserved tissue samples.
The scientists looked at the family trees of these 28 species, to investigate how closely related they were and find out when their ancestors diverged. In total, they explored nearly 70 million years of primate evolution. The scientists then used this knowledge to investigate how the ADH4 genes evolved over time and what the ADH4 genes of their ancestors might have been like.
Then, Carrigan and his colleagues took the genes for ADH4 from these 28 species, as well as the ancestral genes they modeled, and plugged them into bacteria, which read the genes and manufactured the ADH4 enzymes. Next, they tested how well those enzymes broke down ethanol and other alcohols. This method of using bacteria to read ancestral genes is "a new way to observe changes that happened a long time ago that didn't fossilize into bones," Carrigan said.
The results suggested there was a single genetic mutation 10 million years ago that endowed human ancestors with an enhanced ability to break down ethanol. "I remember seeing this huge difference in effects with this mutation and being really surprised," Carrigan said. The scientists noted that the timing of this mutation coincided with a shift to a terrestrial lifestyle. The ability to consume ethanol may have helped human ancestors dine on rotting, fermenting fruit that fell on the forest floor when other food was scarce.
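The enzyme comparison behind these results boils down to standard Michaelis-Menten kinetics: what matters is the catalytic efficiency kcat/KM of each reconstructed enzyme against ethanol. The sketch below uses purely hypothetical parameter values (the real kinetic constants are in the study itself) to show how a single mutation registers as a jump in efficiency.

```python
def catalytic_efficiency(kcat, km):
    """Catalytic efficiency k_cat / K_M: higher means the enzyme
    processes substrate faster at low substrate concentrations."""
    return kcat / km

# Hypothetical ancestral vs. post-mutation ADH4 parameters,
# chosen only to illustrate the comparison, not taken from the paper.
ancestral = catalytic_efficiency(kcat=0.5, km=20.0)   # weak ethanol oxidizer
derived = catalytic_efficiency(kcat=10.0, km=10.0)    # enzyme after the mutation

print(f"fold improvement: {derived / ancestral:.0f}x")  # 40x with these toy numbers
```

With real measured constants plugged in, the same ratio is how one quantifies the "huge difference in effects" Carrigan describes.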
Made famous in the movie Finding Nemo, a sea devil is caught on film for the first time. The anglerfish survived capture and is now being studied in a specially equipped laboratory.
With its gaping mouth, needle-sharp teeth, and slightly startled expression, the black sea devil anglerfish seems tailor-made for the spotlight. And in fact, one particular female got her close-up on November 17 when researchers got footage of this rare anglerfish—the first time this species has been filmed alive and in its natural habitat—off of central California.
A team using a remotely operated vehicle (ROV) in the Monterey Bay Canyon spied this 3.5-inch-long (9 centimeter) black sea devil about 1,900 feet (580 meters) deep. The scientists were then able to bring her up to the surface alive—no mean feat—and have been monitoring the fish ever since. Bruce Robison, a deep-sea ecologist at the Monterey Bay Aquarium Research Institute, has brought up sea devils from the deep before, but never with an ROV. "It came up in absolutely perfect condition," he says.
Having a living animal to study is telling scientists so much more than they could ever have gotten from the dead, preserved specimens floating around various research facilities, Robison explains. "One of the first things that we got back from ichthyologists was astonishment at how the fish uses its dorsal fin to swim," he says. "Nobody had ever seen that." The anglerfish also appeared to be breathing more than expected, given its build, Robison added.
Testing the multiverse hypothesis requires measuring whether our universe is statistically typical among the infinite variety of universes. But if modern physics is to be believed, we shouldn’t be here. The meager dose of energy infusing empty space, which at higher levels would rip the cosmos apart, is a trillion trillion trillion trillion trillion trillion trillion trillion trillion trillion times tinier than theory predicts. And the minuscule mass of the Higgs boson, whose relative smallness allows big structures such as galaxies and humans to form, falls roughly 100 quadrillion times short of expectations. Dialing up either of these constants even a little would render the universe unlivable.
To account for our incredible luck, leading cosmologists like Alan Guth and Stephen Hawking envision our universe as one of countless bubbles in an eternally frothing sea. This infinite “multiverse” would contain universes with constants tuned to any and all possible values, including some outliers, like ours, that have just the right properties to support life. In this scenario, our good luck is inevitable: A peculiar, life-friendly bubble is all we could expect to observe.
Many physicists loathe the multiverse hypothesis, deeming it a cop-out of infinite proportions. But as attempts to paint our universe as an inevitable, self-contained structure falter, the multiverse camp is growing. The problem remains how to test the hypothesis. Proponents of the multiverse idea must show that, among the rare universes that support life, ours is statistically typical. The exact dose of vacuum energy, the precise mass of our underweight Higgs boson, and other anomalies must have high odds within the subset of habitable universes. If the properties of this universe still seem atypical even in the habitable subset, then the multiverse explanation fails.
The multiverse hypothesis gained considerable traction in 1987, when the Nobel laureate Steven Weinberg used it to predict the infinitesimal amount of energy infusing the vacuum of empty space, a number known as the cosmological constant, denoted by the Greek letter Λ (lambda). Vacuum energy is gravitationally repulsive, meaning it causes space-time to stretch apart. Consequently, a universe with a positive value for Λ expands — faster and faster, in fact, as the amount of empty space grows — toward a future as a matter-free void. Universes with negative Λ eventually contract in a “big crunch.”
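The late-time behavior described here follows from the Friedmann equation for a Λ-dominated universe (a schematic form, with units chosen so that c = 1):

```latex
H^2 = \left(\frac{\dot a}{a}\right)^2 = \frac{\Lambda}{3}
\quad\Longrightarrow\quad
a(t) \propto e^{\sqrt{\Lambda/3}\,t},
```

so any positive Λ eventually drives exponential, ever-accelerating expansion toward an empty void, while a negative Λ instead makes the scale factor recollapse in a "big crunch."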
Weinberg turned to a concept called anthropic selection in response to “the continued failure to find a microscopic explanation of the smallness of the cosmological constant,” as he wrote in Physical Review Letters (PRL). He posited that life forms, from which observers of universes are drawn, require the existence of galaxies. The only values of Λ that can be observed are therefore those that allow the universe to expand slowly enough for matter to clump together into galaxies. In his PRL paper, Weinberg reported the maximum possible value of Λ in a universe that has galaxies. It was a multiverse-generated prediction of the most likely density of vacuum energy to be observed, given that observers must exist to observe it.
A decade later, astronomers discovered that the expansion of the cosmos was accelerating at a rate that pegged Λ at 10⁻¹²³ (in units of "Planck energy density"). A value of exactly zero might have implied an unknown symmetry in the laws of quantum mechanics — an explanation without a multiverse. But this absurdly tiny value of the cosmological constant appeared random. And it fell strikingly close to Weinberg's prediction.
[Figure: The infinite multiverse can be divided into finite regions called causal diamonds that range from large and rare with many observers (left) to small and common with few observers (right).]

In this scenario, causal diamonds like ours should be large enough to give rise to many observers but small enough to be relatively common. The causal-diamond measure has now racked up a number of successes. It offers a solution to a mystery of cosmology called the "why now?" problem, which asks why we happen to live at a time when the effects of matter and vacuum energy are comparable, so that the expansion of the universe recently switched from slowing down (signifying a matter-dominated epoch) to speeding up (a vacuum energy-dominated epoch).
The causal-diamond measure falls short in a few ways, however. It does not gauge the probabilities of universes with negative values of the cosmological constant. And its predictions depend sensitively on assumptions about the early universe, at the inception of the future-pointing light cone. But researchers in the field recognize its promise. By sidestepping the infinities underlying the measure problem, the causal diamond “is an oasis of finitude into which we can sink our teeth,” said Andreas Albrecht, a theoretical physicist at the University of California, Davis, and one of the early architects of inflation.
During a thunderstorm, thunder reaches us some time after we see the lightning. That's because sound travels far more slowly (about 768 miles per hour) than light (about 670,000,000 miles per hour).
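That speed gap is the basis of the familiar rule of thumb for judging a storm's distance: count the seconds between flash and thunder. A quick sketch, using roughly 343 m/s (about 768 mph) for the speed of sound in air near room temperature:

```python
SPEED_OF_SOUND_M_S = 343.0  # ~768 mph in air at about 20 °C

def storm_distance_km(seconds_after_flash):
    """Distance to a lightning strike, treating the flash as instantaneous
    (light covers any earthly distance in well under a millisecond)."""
    return SPEED_OF_SOUND_M_S * seconds_after_flash / 1000.0

print(storm_distance_km(5))  # ~1.7 km: roughly a mile per five seconds
```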
Now, University of Minnesota engineering researchers have developed a chip on which both sound waves and light waves are generated and confined together so that the sound can very efficiently control the light. The novel device platform could improve wireless communications systems that use optical fibers and ultimately be used for computation based on quantum physics. The research was recently published in Nature Communications. The University of Minnesota chip is made of a silicon base coated with a layer of aluminum nitride, a piezoelectric material that deforms in response to an electric charge. Applying an alternating electrical signal causes the material to deform periodically, generating sound waves that travel along its surface, much as seismic waves spread out from the epicenter of an earthquake. Such surface acoustic waves are already widely used in cell phones and other wireless devices as microwave filters.
"Our breakthrough is to integrate optical circuits in the same layer of material with acoustic devices in order to attain extreme strong interaction between light and sound waves," said Mo Li, assistant professor in the Department of Electrical and Computer Engineering and the lead researcher of the study.
The researchers used state-of-the-art nanofabrication technology to make arrays of electrodes only 100 nanometers (0.00001 centimeters) wide, exciting sound waves at an unprecedentedly high frequency — above 10 GHz, the band used for satellite communications.
"What's remarkable is that at this high frequency, the wavelength of the sound is even shorter than the wavelength of light. This is achieved for the first time on a chip," said Semere Tadesse, a graduate student in the University of Minnesota's School of Physics and Astronomy and the first author of the paper. "In this unprecedented regime, sound can interact with light most efficiently to achieve high-speed modulation."
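The claim that the acoustic wavelength drops below the optical one follows directly from λ = v/f. A rough check, assuming a surface-acoustic-wave velocity of about 5,700 m/s for aluminum nitride (a typical literature value, not a figure from the paper) and 1,550-nm telecom light:

```python
saw_velocity = 5700.0  # m/s, approximate SAW speed in AlN (assumed value)
frequency = 10e9       # 10 GHz drive frequency

# Acoustic wavelength in nanometers: lambda = v / f
acoustic_wavelength_nm = saw_velocity / frequency * 1e9
print(acoustic_wavelength_nm)  # 570 nm, shorter than 1550-nm telecom light
```

Because sound travels so much more slowly than light, a 10 GHz acoustic wave is compressed into a wavelength shorter than that of the guided light, which is what makes the efficient interaction regime possible.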
For the first time, a team of astronomers - including York University Professor Ray Jayawardhana - has measured the passing of a super-Earth in front of a bright, nearby Sun-like star using a ground-based telescope. The transit of the exoplanet 55 Cancri e is the shallowest yet detected from the ground, and the success bodes well for characterizing the many small planets that upcoming space missions are expected to discover in the next few years.
The international research team used the 2.5-meter Nordic Optical Telescope on the island of La Palma, Spain - a moderate-sized facility by today's standards - to make the detection. Previous observations of this planet's transits had to rely on space-borne telescopes. During a transit, the planet crosses in front of its host star, 55 Cancri, which is located just 40 light-years away and visible to the naked eye, blocking a tiny fraction of the starlight and dimming the star by 1/2000th (or 0.05%) for almost two hours.
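That 0.05% dip also pins down the planet's size, since the transit depth equals the square of the planet-to-star radius ratio. A back-of-the-envelope sketch, assuming for simplicity that the star is exactly Sun-sized (55 Cancri is actually slightly smaller, so the true radius comes out a bit lower):

```python
import math

SUN_RADIUS_M = 6.957e8
EARTH_RADIUS_M = 6.371e6

depth = 1 / 2000                  # 0.05% dimming during transit
radius_ratio = math.sqrt(depth)   # depth = (R_planet / R_star) ** 2
planet_radius_earths = radius_ratio * SUN_RADIUS_M / EARTH_RADIUS_M

print(f"{planet_radius_earths:.1f} Earth radii")  # ~2.4 for a Sun-sized star
```

A couple of Earth radii is exactly the super-Earth regime, consistent with what space-based measurements of 55 Cancri e had found.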
"Our observations show that we can detect the transits of small planets around Sun-like stars using ground-based telescopes," says Dr. Ernst de Mooij, of Queen's University Belfast, UK, the study's lead author. "This is especially important because upcoming space missions such as TESS and PLATO should find many small planets around bright stars." TESS is a NASA mission scheduled for launch in 2017, while PLATO is to be launched in 2024 by the European Space Agency; both will search for transiting terrestrial planets around nearby bright stars.
"It's remarkable what we can do by pushing the limits of existing telescopes and instruments, despite the complications posed by the Earth's own turbulent atmosphere," says Jayawardhana, the study's co-author and de Mooij's former postdoctoral supervisor. "Observations like these are paving the way as we strive towards searching for signs of life on alien planets from afar. Remote sensing across tens of light-years isn't easy, but it can be done with the right technique and a bit of ingenuity."
The first ten months of 2014 have been the hottest since record keeping began more than 130 years ago, according to data from the National Oceanic and Atmospheric Administration. That may be hard to believe for people in places like Buffalo, New York, which saw a record early snowfall this year.
But NOAA says, despite the early bitter cold across parts of the United States in recent weeks, it's been a hot year so far for the Earth. With two months left on the calendar, 2014 is shaping up to be the hottest year on record. The average global temperature between January and October has been 0.68 degrees Celsius (1.22 degrees Fahrenheit) higher than the 20th century's average global temperature of 14.1 C (57.4 F).
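Note that converting a temperature anomaly between scales uses only the 9/5 scale factor, not the +32 offset, because the offset cancels when subtracting two temperatures. A quick check against NOAA's figures:

```python
def anomaly_c_to_f(delta_c):
    """Convert a temperature *difference* from Celsius to Fahrenheit.
    Unlike absolute temperatures, differences need no +32 offset."""
    return delta_c * 9 / 5

print(round(anomaly_c_to_f(0.68), 2))  # 1.22, matching the reported anomaly
```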
NOAA's analysis is an important "health gauge" indicating an ominous trend for the planet, says CNN meteorologist Derek Van Dam.
"It's becoming increasingly more difficult to be a skeptic of the causes of our warming planet," he says. This October was the hottest October on record globally, NOAA data showed. The mercury climbed more than one degree Fahrenheit above the 20th century average of 57.1 F. It was the fourth warmest October on record for the United States, NOAA said. "The record high October temperature was driven by warmth across the globe over both the land and ocean surfaces and was fairly evenly distributed between the Northern and Southern Hemispheres," the agency said.
NOAA's analysis breaks down global temperatures into two categories -- land and ocean -- then an average that includes both. The record high temperatures in October were recorded across both land and sea. The surface temperature on land approached an important scientific benchmark. It was almost 2 degrees Celsius higher than the 20th century average for October of 9.3 C (48.7 F). According to the non-binding international agreement on climate change -- the Copenhagen Accord, reached in 2009 -- any temperature increase above the 2 degree Celsius mark is "dangerous."
In a quantum network, information is stored, processed, and transmitted from one device to another in the form of quantum states. The quantum nature of the network gives it certain advantages over classical networks, such as greater security.
One promising method for implementing a quantum network involves using both atoms and photons for their unique advantages. While atoms are useful as nodes (in the form of quantum memories and processors) due to their long storage times, photons are useful as links (on optical fibers) because they're better at carrying quantum information over large distances.
However, using both atoms and photons requires that quantum states be converted between single atoms and single photons. This in turn requires a high degree of control over the emission and absorption processes in which single atoms act as senders and receivers of single photons. Because it's difficult to achieve complete overlap between the atomic and photonic modes, photon-to-atom state transfer usually suffers from low fidelities of below 10%. This means that more than 90% of the time the state transfer is unsuccessful.
In a new paper published in Nature Communications, a team of researchers led by Jürgen Eschner, Professor at Saarland University in Saarbrucken, Germany, has experimentally demonstrated photon-to-atom quantum state transfer with a fidelity of more than 95%. This drastic improvement marks an important step toward realizing future large-scale quantum networks. The researchers' protocol consists of transferring the polarization state of a laser photon onto the ground state of a trapped calcium ion. To do this, the researchers prepared the calcium ion in a quantum superposition state, in which it simultaneously occupies two atomic levels. When the ion absorbs a photon emitted by a laser at an 854-nm wavelength, the photon's polarization state gets mapped onto the ion. Upon absorbing the photon, the ion returns to its ground state and emits a single photon at a 393-nm wavelength. Detection of this 393-nm photon signifies a successful photon-to-atom quantum state transfer.
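Schematically - a simplified two-level picture, not the paper's full atomic level scheme - the protocol maps the photon's polarization qubit onto a superposition of two atomic ground-state sublevels, with the 393-nm photon heralding success:

```latex
\bigl(\alpha\,|\sigma^{+}\rangle + \beta\,|\sigma^{-}\rangle\bigr)_{854\,\mathrm{nm}}
\;\otimes\; |\psi\rangle_{\mathrm{ion}}
\;\longrightarrow\;
\bigl(\alpha\,|{\uparrow}\rangle + \beta\,|{\downarrow}\rangle\bigr)_{\mathrm{ion}}
\;\otimes\; |1\rangle_{393\,\mathrm{nm}}
```

Conditioning on the detection of the 393-nm photon is what filters out failed absorption events, which is why the heralded fidelity can be so high even though the overall efficiency remains low.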
The researchers showed that this method achieves very high fidelities of 95-97% using a variety of atomic states and both linear and circular polarizations. The method also has a relatively high efficiency of 0.438%. The researchers explain that the large fidelity improvement is due in large part to the last step involving the detection of the 393-nm photon.