Amazing Science
Rescooped by Dr. Stefan Gruenwald from Fragments of Science!

Machine learning kept unstable quantum bits in line – even before they wavered


Imagine predicting your car will break down and being able to replace the faulty part before it becomes a problem. Now Australian physicists have found a way to do this – albeit on a quantum scale.


In Nature Communications, they enlisted machine learning to “foresee” the future failure of a quantum bit, or qubit, and to make the corrections needed to stop that failure from happening.


Quantum computing is a potentially world-changing technology that could complete in minutes tasks that would take current computers thousands of years. But practical, large-scale quantum technology is probably a long way off.


One of the major challenges is maintaining qubits in the delicate, zen-like state of superposition they need to do their business.

Any tiny nudge from the environment – such as the jiggly atom next door – knocks the qubit off balance.


So physicists go to great lengths to stabilize qubits, cooling them to more than 200 degrees below zero to reduce atomic jiggling. Still, superposition typically lasts but a tiny fraction of a second, and this cuts quantum number-crunching time short.


A team led by Michael Biercuk at the University of Sydney found a new way of stabilizing qubits against noise in the environment. It works by predicting how a qubit will behave and acting preemptively. In a quantum computer, the technique could make qubits twice as stable as before. The team used control theory and machine learning (a kind of artificial intelligence) to estimate how the future of a qubit would play out.


Control theory is the branch of engineering that deals with feedback systems, such as the thermostat keeping your room temperature constant. The thermostat reacts to changes in the environment, initiating warm or cool air to pump into the room. Meanwhile, new machine learning algorithms look at how the system behaved in the past and use this information to predict how it will react to future events.


First, Biercuk’s team made a qubit by trapping a single ion of ytterbium in a beam of laser light. To train their algorithm, they simulated noise, tweaking the light to disturb the atom in a controlled way. Their algorithm monitored how the qubit responded to these tweaks and made a prediction for how it would behave in future. Next, they let events play out for the qubit to check their algorithm’s accuracy. The longer the algorithm watched the qubit, the more accurate its predictions became. Finally, the team used the predictions to help the system self-correct. The qubit was twice as stable with the algorithm as without it.
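As a rough illustration of the predict-and-correct idea described above (not the team's actual algorithm, whose details are not given here), the sketch below models a qubit's drifting detuning as a slow random walk, predicts the next value by extrapolating from recent observations, and applies the opposite correction preemptively:

```python
import random

random.seed(1)

def drift_steps(n, sigma=0.02):
    """Slowly drifting qubit detuning, modeled as a toy random walk."""
    x, out = 0.0, []
    for _ in range(n):
        x += random.gauss(0.0, sigma)
        out.append(x)
    return out

steps = drift_steps(2000)

# Uncorrected: the detuning is just the accumulated drift.
uncorrected = steps

# Predict-and-correct: before each step, extrapolate the next detuning
# linearly from the two most recent observations, then subtract that
# prediction. Only the unpredictable part of the noise remains.
corrected, history = [], [0.0, 0.0]
for x in steps:
    predicted = history[-1] + (history[-1] - history[-2])
    corrected.append(x - predicted)  # residual error after correction
    history.append(x)

def rms(xs):
    return (sum(v * v for v in xs) / len(xs)) ** 0.5

print(rms(uncorrected), rms(corrected))
```

In this toy model the residual error after correction is far smaller than the raw drift, which is the essence of acting on a prediction rather than reacting after the fact.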

Via Mariaschnee
Scooped by Dr. Stefan Gruenwald!

Neutrons and a 'bit of gold' uncover new type of quantum phase transition


When matter changes from solids to liquids to vapors, the changes are called phase transitions. Among the most interesting types are more exotic changes—quantum phase transitions—where the strange properties of quantum mechanics can bring about extraordinary changes in curious ways.


In a paper published in Physical Review Letters, a team of researchers led by the Department of Energy's Oak Ridge National Laboratory reports the discovery of a new type of quantum phase transition. This unique transition happens at an elastic quantum critical point, or QCP, where the phase transition isn't driven by thermal energy but instead by the quantum fluctuations of the atoms themselves.


The researchers used a combination of neutron and X-ray diffraction techniques, along with heat capacity measurements, to reveal how an elastic QCP can be found in a lanthanum-copper material by simply adding a little bit of gold.


Phase transitions associated with QCPs happen at near absolute zero temperature (about minus 460 degrees Fahrenheit), and are typically driven at that temperature via factors such as pressure, magnetic fields, or by substituting additional chemicals or elements in the material.


"We study QCPs because materials exhibit many strange and exciting behaviors near the zero temperature phase transition that can't be explained by classical physics," said lead author Lekh Poudel, a University of Tennessee graduate student working in ORNL's Quantum Condensed Matter Division. "Our goal was to explore the possibility of a new type of QCP where the quantum motion alters the arrangement of atoms.


"Its existence had been theoretically predicted, but there hadn't been any experimental proof until now," he said. "We're the first to establish that the elastic QCP does exist."

Scooped by Dr. Stefan Gruenwald!

Physicists measure the loss of dark matter since the birth of the universe


Russian scientists have discovered that the proportion of unstable particles in the composition of dark matter in the days immediately following the Big Bang was no more than 2 percent to 5 percent.


"The discrepancy between the cosmological parameters in the modern universe and the universe shortly after the Big Bang can be explained by the fact that the proportion of dark matter has decreased. We have now, for the first time, been able to calculate how much dark matter could have been lost, and what the corresponding size of the unstable component would be," says co-author Igor Tkachev of the Department of Experimental Physics at INR.


Astronomers first suspected that there was a large proportion of hidden mass in the universe back in the 1930s, when Fritz Zwicky discovered "peculiarities" in a cluster of galaxies in the constellation Coma Berenices—the galaxies moved as if they were under the effect of gravity from an unseen source. This hidden mass, which is only deduced from its gravitational effect, was given the name dark matter. According to data from the Planck space telescope, the proportion of dark matter in the universe is 26.8 percent; the rest is "ordinary" matter (4.9 percent) and dark energy (68.3 percent).


The nature of dark matter remains unknown. However, its properties could potentially help scientists to solve a problem that arose after studying observations from the Planck telescope. This device accurately measured the fluctuations in the temperature of the cosmic microwave background radiation—the "echo" of the Big Bang. By measuring these fluctuations, the researchers were able to calculate key cosmological parameters using observations of the universe in the recombination era—approximately 300,000 years after the Big Bang.


However, when researchers directly measured the speed of the expansion of galaxies in the modern universe, it turned out that some of these parameters varied significantly—namely the Hubble parameter, which describes the rate of expansion of the universe, and also the parameter associated with the number of galaxies in clusters. "This variance was significantly more than margins of error and systematic errors known to us. Therefore, we are either dealing with some kind of unknown error, or the composition of the ancient universe is considerably different to the modern universe," says Tkachev.

Scooped by Dr. Stefan Gruenwald!

CERN experiment observes the light spectrum of antimatter for the first time


In a paper published today in Nature, the ALPHA collaboration reports the first ever measurement on the optical spectrum of an antimatter atom. This achievement features technological developments that open up a completely new era in high-precision antimatter research. It is the result of over 20 years of work by the CERN antimatter community.


"Using a laser to observe a transition in antihydrogen and comparing it to hydrogen to see if they obey the same laws of physics has always been a key goal of antimatter research," said Jeffrey Hangst, Spokesperson of the ALPHA collaboration.

Atoms consist of electrons orbiting a nucleus. When the electrons move from one orbit to another they absorb or emit light at specific wavelengths, forming the atom's spectrum. Each element has a unique spectrum. As a result, spectroscopy is a commonly used tool in many areas of physics, astronomy and chemistry. It helps to characterise atoms and molecules and their internal states. For example, in astrophysics, analysing the light spectrum of remote stars allows scientists to determine their composition.


With its single proton and single electron, hydrogen is the most abundant, simple and well-understood atom in the Universe. Its spectrum has been measured to very high precision.
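The hydrogen spectrum mentioned above follows the Rydberg formula, 1/λ = R_H (1/n₁² − 1/n₂²). As a worked sketch, the snippet below computes the n = 2 → n = 1 (Lyman-alpha) wavelength and notes that the same 1S–2S energy interval, when driven with two photons as in the ALPHA experiment, calls for photons of roughly twice that wavelength:

```python
R_H = 1.0967758e7  # Rydberg constant for hydrogen, in 1/m

def wavelength_nm(n1, n2):
    """Wavelength of the n2 -> n1 transition, via the Rydberg formula."""
    return 1.0 / (R_H * (1.0 / n1**2 - 1.0 / n2**2)) * 1e9

lyman_alpha = wavelength_nm(1, 2)
print(lyman_alpha)      # about 121.6 nm

# Spanning the same n=1 -> n=2 energy gap with two photons requires
# each photon to carry half the energy, i.e. twice the wavelength:
print(2 * lyman_alpha)  # about 243 nm, the regime of the 1S-2S laser
```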


Antihydrogen atoms, on the other hand, are poorly understood. Because the Universe appears to consist entirely of matter, the constituents of antihydrogen atoms – antiprotons and positrons – have to be produced and assembled into atoms before the antihydrogen spectrum can be measured. It’s a painstaking process, but well worth the effort since any measurable difference between the spectra of hydrogen and antihydrogen would break basic principles of physics and possibly help understand the puzzle of the matter-antimatter imbalance in the Universe.


Today’s ALPHA result is the first observation of a spectral line in an antihydrogen atom, allowing the light spectrum of matter and antimatter to be compared for the first time. Within experimental limits, the result shows no difference compared to the equivalent spectral line in hydrogen. This is consistent with the Standard Model of particle physics, the theory that best describes particles and the forces at work between them, which predicts that hydrogen and antihydrogen should have identical spectroscopic characteristics.


The ALPHA collaboration expects to improve the precision of its measurements in the future. Measuring the antihydrogen spectrum with high-precision offers an extraordinary new tool to test whether matter behaves differently from antimatter and thus to further test the robustness of the Standard Model.

Scooped by Dr. Stefan Gruenwald!

A New Spin on the Quantum Brain


A new theory explains how fragile quantum states may be able to exist for hours or even days in our warm, wet brain. Experiments could soon put the idea to the test.


The mere mention of “quantum consciousness” makes most physicists cringe, as the phrase seems to evoke the vague, insipid musings of a New Age guru. But if a new hypothesis proves to be correct, quantum effects might indeed play some role in human cognition. Matthew Fisher, a physicist at the University of California, Santa Barbara, raised eyebrows late last year when he published a paper in Annals of Physics proposing that the nuclear spins of phosphorus atoms could serve as rudimentary “qubits” in the brain — which would essentially enable the brain to function like a quantum computer.


As recently as 10 years ago, Fisher’s hypothesis would have been dismissed by many as nonsense. Physicists have been burned by this sort of thing before, most notably in 1989, when Roger Penrose proposed that mysterious protein structures called “microtubules” played a role in human consciousness by exploiting quantum effects. Few researchers believe such a hypothesis plausible. Patricia Churchland, a neurophilosopher at the University of California, San Diego, memorably opined that one might as well invoke “pixie dust in the synapses” to explain human cognition.


Fisher’s hypothesis faces the same daunting obstacle that has plagued microtubules: a phenomenon called quantum decoherence. To build an operating quantum computer, you need to connect qubits — quantum bits of information — in a process called entanglement. But entangled qubits exist in a fragile state. They must be carefully shielded from any noise in the surrounding environment. Just one photon bumping into your qubit would be enough to make the entire system “decohere,” destroying the entanglement and wiping out the quantum properties of the system. It’s challenging enough to do quantum processing in a carefully controlled laboratory environment, never mind the warm, wet, complicated mess that is human biology, where maintaining coherence for sufficiently long periods of time is well nigh impossible.


Over the past decade, however, growing evidence suggests that certain biological systems might employ quantum mechanics. In photosynthesis, for example, quantum effects help plants turn sunlight into fuel. Scientists have also proposed that migratory birds have a “quantum compass” enabling them to exploit Earth’s magnetic fields for navigation, or that the human sense of smell could be rooted in quantum mechanics.


Fisher’s notion of quantum processing in the brain broadly fits into this emerging field of quantum biology. Call it quantum neuroscience. He has developed a complicated hypothesis, incorporating nuclear and quantum physics, organic chemistry, neuroscience and biology. While his ideas have met with plenty of justifiable skepticism, some researchers are starting to pay attention. “Those who read his paper (as I hope many will) are bound to conclude: This old guy’s not so crazy,” wrote John Preskill, a physicist at the California Institute of Technology, after Fisher gave a talk there. “He may be on to something. At least he’s raising some very interesting questions.”


Senthil Todadri, a physicist at the Massachusetts Institute of Technology and Fisher’s longtime friend and colleague, is skeptical, but he thinks that Fisher has rephrased the central question — is quantum processing happening in the brain? — in such a way that it lays out a road map to test the hypothesis rigorously. “The general assumption has been that of course there is no quantum information processing that’s possible in the brain,” Todadri said. “He makes the case that there’s precisely one loophole. So the next step is to see if that loophole can be closed.” Indeed, Fisher has begun to bring together a team to do laboratory tests to answer this question once and for all.

Rescooped by Dr. Stefan Gruenwald from Fragments of Science!

Physicists confirm the precision of magnetic fields in the most advanced stellarator in the world


Physicist Sam Lazerson of the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL) has teamed with German scientists to confirm that the Wendelstein 7-X (W7-X) fusion energy device called a stellarator in Greifswald, Germany, produces high-quality magnetic fields that are consistent with their complex design.

The findings, published in the November 30 issue of Nature Communications, revealed an error field—or deviation from the designed configuration—of less than one part in 100,000. Such results could become a key step toward verifying the feasibility of stellarators as models for future fusion reactors.


W7-X, for which PPPL is the leading U.S. collaborator, is the largest and most sophisticated stellarator in the world. Built by the Max Planck Institute for Plasma Physics in Greifswald, it was completed in 2015 as the vanguard of the stellarator design. Other collaborators on the U.S. team include DOE's Oak Ridge and Los Alamos National Laboratories, along with Auburn University, the Massachusetts Institute of Technology, the University of Wisconsin-Madison and Xanthos Technologies.


Stellarators confine the hot, charged gas, otherwise known as plasma, that fuels fusion reactions in twisty—or 3-D—magnetic fields, compared with the symmetrical—or 2-D—fields that the more widely used tokamaks create. The twisty configuration enables stellarators to control the plasma with no need for the current that tokamaks must induce in the gas to complete the magnetic field. Stellarator plasmas thus run little risk of disrupting, as can happen in tokamaks, causing the internal current to abruptly halt and fusion reactions to shut down.


PPPL has played key roles in the W7-X project. The Laboratory designed and delivered five barn door-sized trim coils that fine-tune the stellarator's magnetic fields and made their measurement possible. "We've confirmed that the magnetic cage that we've built works as designed," said Lazerson, who led roughly half the experiments that validated the configuration of the field. "This reflects U.S. contributions to W7-X," he added, "and highlights PPPL's ability to conduct international collaborations." Support for this work comes from Euratom and the DOE Office of Science.

Via Mariaschnee
Rescooped by Dr. Stefan Gruenwald from Fragments of Science!

High-precision magnetic field sensing

Scientists have developed a highly sensitive sensor to detect tiny changes in strong magnetic fields. The sensor may find widespread use in medicine and other areas.


Researchers from the Institute for Biomedical Engineering, which is operated jointly by ETH Zurich and the University of Zurich, have succeeded in measuring tiny changes in strong magnetic fields with unprecedented precision. In their experiments, the scientists magnetised a water droplet inside a magnetic resonance imaging (MRI) scanner, a device that is used for medical imaging. The researchers were able to detect even the tiniest variations of the magnetic field strength within the droplet. These changes were up to a trillion times smaller than the seven tesla field strength of the MRI scanner used in the experiment.


"Until now, it was possible only to measure such small variations in weak magnetic fields," says Klaas Prüssmann, Professor of Bioimaging at ETH Zurich and the University of Zurich. An example of a weak magnetic field is that of the Earth, where the field strength is just a few dozen microtesla. For fields of this kind, highly sensitive measurement methods are already able to detect variations of about a trillionth of the field strength, says Prüssmann. "Now, we have a similarly sensitive method for strong fields of more than one tesla, such as those used, inter alia, in medical imaging."
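To put "a trillionth of a strong field" in concrete terms: in nuclear magnetic resonance, protons precess at the Larmor frequency f = γ̄·B, where γ̄ ≈ 42.577 MHz per tesla for protons. A back-of-the-envelope sketch of the scales involved (illustrative arithmetic, not the paper's analysis):

```python
GAMMA_BAR = 42.577e6  # proton gyromagnetic ratio / 2*pi, in Hz per tesla

B0 = 7.0                       # main field of the MRI scanner, tesla
f_larmor = GAMMA_BAR * B0      # proton precession frequency, Hz
delta_B = B0 * 1e-12           # a field change a trillion times smaller
delta_f = GAMMA_BAR * delta_B  # corresponding frequency shift, Hz

print(f_larmor)  # roughly 3e8 Hz, i.e. ~298 MHz
print(delta_f)   # roughly 3e-4 Hz: a sub-millihertz shift to resolve
```

Resolving a sub-millihertz shift on a ~300 MHz carrier is what makes the low-noise receiver described below essential.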


The scientists based the sensing technique on the principle of nuclear magnetic resonance, which also serves as the basis for magnetic resonance imaging and the spectroscopic methods that biologists use to elucidate the 3D structure of molecules.


However, to measure the variations, the scientists had to build a new high-precision sensor, part of which is a highly sensitive digital radio receiver. "This allowed us to reduce background noise to an extremely low level during the measurements," says Simon Gross. Gross wrote his doctoral thesis on this topic in Prüssmann's group and is lead author of the paper published in the journal Nature Communications.

Via Mariaschnee
Scooped by Dr. Stefan Gruenwald!

Toward the development of X-ray movies

MIT researchers find a tabletop all-optical terahertz-driven electron gun could replace car-sized radio-frequency (RF) guns in electron diffraction imaging and ultrafast X-ray imaging.


Ultrashort bursts of electrons have several important applications in scientific imaging, but producing them has typically required a costly, power-hungry apparatus about the size of a car. In the journal Optica, researchers at MIT, the German Synchrotron, and the University of Hamburg in Germany describe a new technique for generating electron bursts, which could be the basis of a shoebox-sized device that consumes only a fraction as much power as its predecessors.


Ultrashort electron beams are used to directly gather information about materials that are undergoing chemical reactions or changes of physical state. But after being fired down a half-mile-long particle accelerator, they’re also used to produce ultrashort X-rays. Last year, in Nature Communications, the same group of MIT and Hamburg researchers reported the prototype of a small “linear accelerator” that could serve the same purpose as the much larger and more expensive particle accelerator. That technology, together with a higher-energy version of the new “electron gun,” could bring the imaging power of ultrashort X-ray pulses to academic and industry labs.


Indeed, while the electron bursts reported in the new paper have a duration measured in hundreds of femtoseconds, or quadrillionths of a second (which is about what the best existing electron guns can manage), the researchers’ approach has the potential to lower their duration to a single femtosecond. An electron burst of a single femtosecond could generate attosecond X-ray pulses, which would enable real-time imaging of cellular machinery in action.


“We’re building a tool for the chemists, physicists, and biologists who use X-ray light sources or the electron beams directly to do their research,” says Ronny Huang, an MIT PhD student in electrical engineering and first author on the new paper. “Because these electron beams are so short, they allow you to kind of freeze the motion of electrons inside molecules as the molecules are undergoing a chemical reaction. A femtosecond X-ray light source requires more hardware, but it utilizes electron guns.”


In particular, Huang explains, with a technique called electron diffraction imaging, physicists and chemists use ultrashort bursts of electrons to investigate phase changes in materials, such as the transition from an electrically conductive to a nonconductive state, and the creation and dissolution of bonds between molecules in chemical reactions.


Ultrashort X-ray pulses have the same advantages that ordinary X-rays do: They penetrate more deeply into thicker materials. The current method for producing ultrashort X-rays involves sending electron bursts from a car-sized electron gun through a billion-dollar, kilometer-long particle accelerator that increases their velocity. Then they pass between two rows of magnets — known as an “undulator” — that converts them to X-rays.

Rescooped by Dr. Stefan Gruenwald from Fragments of Science!

NIST debuts dual atomic clock—and a new stability record


What could be better than a world-leading atomic clock? Two clocks in one.

Physicists at the National Institute of Standards and Technology (NIST) have combined two experimental atomic clocks based on ytterbium atoms to set yet another world record for clock stability. Stability can be thought of as how precisely the duration of each clock tick matches every other tick that comes before and after.


This extraordinary stability makes the ytterbium lattice clock a more powerful tool for precision tests such as whether the "fundamental constants" of nature are really constant, and searches for the elusive dark matter purported to make up much of the universe. The experiment demonstrating the double-clock design is reported in Nature Photonics.


"We eliminated a critical type of noise in the clock's operation, effectively making the clock signal stronger," NIST physicist Andrew Ludlow said. "This means we can reach a clock instability of 1.5 parts in a quintillion (1 followed by 18 zeros) in just a few thousand seconds. While this only slightly beats the record level of clock stability we demonstrated a few years ago, we get there 10 times faster."
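Clock instability typically averages down with measurement time: for white frequency noise, σ(τ) = σ₁ₛ/√τ. The sketch below takes an assumed (illustrative, not NIST's published) one-second instability of 1×10⁻¹⁶ and checks that the quoted 1.5 parts in a quintillion would then indeed be reached in "a few thousand seconds":

```python
import math

sigma_1s = 1.0e-16  # assumed fractional instability at 1 s (illustrative)
target = 1.5e-18    # instability level quoted in the article

def sigma(tau):
    """White-frequency-noise averaging: sigma(tau) = sigma_1s / sqrt(tau)."""
    return sigma_1s / math.sqrt(tau)

# Invert sigma(tau) = target for the averaging time needed:
tau_needed = (sigma_1s / target) ** 2
print(tau_needed)          # about 4.4e3 seconds
print(sigma(tau_needed))   # recovers the 1.5e-18 target
```

The "10 times faster" claim then corresponds to reaching the same σ(τ) with a smaller σ₁ₛ, which shrinks the required τ quadratically.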


NIST atomic clocks routinely perform at very high levels, but scientists continually tweak them to reduce slight imperfections. The new double-clock design eliminates a small but significant distortion in the laser frequency that probes and synchronizes with the atoms. The more stable the clock, the better its measurement power.


The new ytterbium lattice 'double clock' is the most stable clock in the world, although another NIST atomic clock, based on strontium and located at JILA, holds the world record for precision. Precision refers to how closely the clock tunes itself to the natural frequency at which the atoms oscillate between two electronic energy levels.

Via Mariaschnee
Rescooped by Dr. Stefan Gruenwald from Fragments of Science!

Quantum computers can talk to each other via a photon translator


Different kinds of quantum computers encode information using their own wavelengths of light, but a device that modifies their photons could allow them to network.


Quantum computers are theoretically capable of running calculations exponentially faster than classical computers, and can be made by exploiting atoms, superconductors, diamond crystals and more. Each of these has its own strengths: atoms are better at storing information, while superconductors are better at processing it. A device linking these diverse systems together would combine their strengths and compensate for their weaknesses. Once linked, these systems would talk to each other by sending and receiving photons. The photons would encode quantum states but, unlike the voltages and currents interpreted by a classical computer chip, they cannot be transmitted via copper wires.


What’s more, quantum rules require that a single photon must essentially carry a spread of frequencies, rather than a single frequency. For different components to talk to each other using photons, the spread of the sender’s photons must therefore be converted to the spread that the receiver can handle. That requires a device in the middle that can convert photons from one spread of frequencies to another, while still preserving their delicate quantum state.


Christine Silberhorn of the University of Paderborn in Germany and her colleagues have designed such a system. It includes a converter that “translates” photons emitted from one component into the infrared. That infrared photon is then transmitted over a fibre optic cable connected to a second component. Finally, the photon is translated into another frequency that the receiving component can read.
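One standard way to shift a photon's frequency while preserving its quantum state is nonlinear sum-frequency conversion, where energy conservation fixes the output: f_out = f_in + f_pump. The wavelengths below are illustrative assumptions, not the values used in the Paderborn experiment:

```python
C = 2.998e8  # speed of light, m/s

def freq(wavelength_m):
    return C / wavelength_m

# Illustrative values only:
f_ir = freq(1550e-9)   # telecom-band infrared single photon
f_pump = freq(860e-9)  # strong classical pump field

# Energy conservation in sum-frequency generation:
f_out = f_ir + f_pump
lambda_out = C / f_out
print(lambda_out * 1e9)  # about 553 nm: a visible, green photon
```

The hard part, as the article notes, is doing this for the photon's whole spread of frequencies at once without disturbing its quantum state.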


Only part of the system has been built so far: the researchers have managed to convert infrared photons to a visible wavelength – while leaving their quantum state intact – with a success rate of about 75 per cent. But the technique could be adapted to build the full system, Silberhorn says.


Once that is done, the next step would be to figure out how to fit the device on a chip that could be manufactured easily and cheaply in large quantities, says Arka Majumdar of the University of Washington in Seattle. “The science works,” he says. “But scalability is the biggest problem. Making the same device 1000 times is extremely difficult.”

Via Mariaschnee
Scooped by Dr. Stefan Gruenwald!

Strange Numbers Found in Particle Collisions


Periods and amplitudes were presented together for the first time in 1994 by Kreimer and David Broadhurst, a physicist at the Open University in England, with a paper following in 1995. The work led mathematicians to speculate that all amplitudes were periods of mixed Tate motives — a special kind of motive named after John Tate, emeritus professor at Harvard University, in which all the periods are multiple values of one of the most influential constructions in number theory, the Riemann zeta function. In the situation with an electron-positron pair going in and a muon-antimuon pair coming out, the main part of the amplitude comes out as six times the Riemann zeta function evaluated at three.


If all amplitudes were multiple zeta values, it would give physicists a well-defined class of numbers to work with. But in 2012 Brown and his collaborator Oliver Schnetz proved that’s not the case. While all the amplitudes physicists come across today may be periods of mixed Tate motives, “there are monsters lurking out there that throw a spanner into the works,” Brown said. Those monsters are “certainly periods, but they’re not the nice and simple periods people had hoped for.”

What physicists and mathematicians do know is that there seems to be a connection between the number of loops in a Feynman diagram and a notion in mathematics called “weight.”


Weight is a number related to the dimension of the space being integrated over: A period integral over a one-dimensional space can have a weight of 0, 1 or 2; a period integral over a two-dimensional space can have weight up to 4, and so on. Weight can also be used to sort periods into different types: All periods of weight 0 are conjectured to be algebraic numbers, which can be the solutions to polynomial equations (this has not been proved); the period of a pendulum always has a weight of 1; pi is a period of weight 2; and the weights of values of the Riemann zeta function are always twice the input (so the zeta function evaluated at 3 has a weight of 6).
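The weight rule for zeta values, and the amplitude quoted earlier (six times the Riemann zeta function at three), can be checked numerically. The sketch below sums the zeta series directly, which converges well enough for arguments of 2 or more:

```python
def zeta(s, terms=200000):
    """Riemann zeta via direct summation (adequate for s >= 2)."""
    return sum(1.0 / n**s for n in range(1, terms + 1))

def zeta_weight(s):
    """Per the text, the weight of a zeta value is twice its argument."""
    return 2 * s

# Main part of the e+e- -> mu+mu- amplitude discussed above:
amplitude = 6 * zeta(3)
print(zeta_weight(3))  # 6
print(amplitude)       # about 7.212
```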


This classification of periods by weights carries over to Feynman diagrams, where the number of loops in a diagram is somehow related to the weight of its amplitude. Diagrams with no loops have amplitudes of weight 0; the amplitudes of diagrams with one loop are all periods of mixed Tate motives and have, at most, a weight of 4. For graphs with additional loops, mathematicians suspect the relationship continues, even if they can’t see it yet.


“We go to higher loops and we see periods of a more general type,” Kreimer said. “There mathematicians get really interested because they don’t understand much about motives that are not mixed Tate motives.”


Mathematicians and physicists are currently going back and forth trying to establish the scope of the problem and craft solutions. Mathematicians suggest functions (and their integrals) to physicists that can be used to describe Feynman diagrams. Physicists produce configurations of particle collisions that outstrip the functions mathematicians have to offer. “It’s quite amazing to see how fast they’ve assimilated quite technical mathematical ideas,” Brown said. “We’ve run out of classical numbers and functions to give to physicists.”

Scooped by Dr. Stefan Gruenwald!

Close to absolute zero, electrons exhibit their quantum nature


What would happen if an electric current no longer flowed, but trickled instead? This was the question investigated by researchers working with Christian Ast at the Max Planck Institute for Solid State Research. Their investigation involved cooling their scanning tunnelling microscope down to a fifteen thousandth of a degree above absolute zero. At these extremely low temperatures, the electrons reveal their quantum nature. The electric current is therefore a granular medium, consisting of individual particles. The electrons trickle through a conductor like grains of sand in an hourglass, a phenomenon that can be explained with the aid of quantum electrodynamics.


Flowing water from a tap feels like a homogeneous medium - it is impossible to distinguish between the individual water molecules. Exactly the same thing is true about electric current. So many electrons flow in a conventional cable that the current appears to be homogeneous. Although it is not possible to distinguish individual electrons, quantum mechanics says they should exist. So how do they behave? Under which conditions does the current not flow like water through a tap, but rather trickles like sand in an hourglass?


The hourglass analogy is very apt for the scanning tunneling microscope, in which a thin, pointed tip scans across the surface of a sample without actually touching it. A tiny current flows nevertheless, as there is a slight probability that electrons "tunnel" from the tip into the sample. This tunneling current depends exponentially on the separation, which is why the tip is positioned only a few ångströms above the sample.
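That exponential sensitivity can be made concrete with a back-of-envelope sketch (not from the article; the ~4.5 eV work function and the one-ångström step are illustrative assumptions):

```python
import math

# Rough estimate of how sharply an STM tunneling current falls off with
# tip-sample separation: I(d) ~ I0 * exp(-2*kappa*d), where the decay
# constant is kappa = sqrt(2*m*phi)/hbar for barrier height (work
# function) phi.
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # joules per electronvolt

phi = 4.5 * EV           # assumed work function ~4.5 eV (typical for metals)
kappa = math.sqrt(2 * M_E * phi) / HBAR   # ~1.1e10 per meter, i.e. ~1.1 per angstrom

# Current ratio when the tip approaches the sample by one angstrom:
ratio = math.exp(2 * kappa * 1e-10)

print(f"kappa = {kappa:.3e} 1/m")
print(f"approaching by 1 angstrom raises the current ~{ratio:.1f}x")
```

With these assumed numbers the current changes by nearly an order of magnitude per ångström, which is why the instrument resolves individual atoms.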


Minute variations in the tunneling current thus allow researchers to resolve individual atoms and atomic structures on surfaces and investigate their electronic structure. Scanning tunneling microscopes are therefore some of the most versatile and sensitive detectors in the whole of solid state physics.


Even under these extreme conditions – with a tiny current of less than one billionth of the current that flows through a 100-watt light bulb – billions of electrons per second still flow, far too many to discern individually. The temperature had to be lowered to around fifteen thousandths of a degree above absolute zero (i.e. to minus 273.135°C, or 15 mK) before the scientists could see that the electric current consists of individual electrons.
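A quick sanity check of that "billions of electrons per second" figure (the 230-V mains voltage is an assumption; the article gives no voltage):

```python
# A 100-W bulb on 230-V mains draws I = P/U; one billionth of that
# current, divided by the elementary charge, gives the rate of electrons
# passing through the STM tunnel junction.
E_CHARGE = 1.602176634e-19   # elementary charge, C

bulb_current = 100 / 230              # ~0.43 A
stm_current = bulb_current * 1e-9     # "one billionth" -> ~0.43 nA
electrons_per_second = stm_current / E_CHARGE

print(f"STM current: {stm_current * 1e9:.2f} nA")
print(f"electron rate: {electrons_per_second:.2e} per second")
```

Even a sub-nanoamp current corresponds to a few billion electrons per second, consistent with the article's claim.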


At this low temperature, very fine structures, which the researchers had not expected, appear in the spectrum. "We could explain these new structures only by assuming that the tunneling current is a granular medium and no longer homogeneous," says Ast, who heads the group working with the scanning tunneling microscope. This is thus the first time that the full quantum nature of electronic transport in the scanning tunneling microscope has shown itself.

Rescooped by Dr. Stefan Gruenwald from SciFrye!

Units of measure are getting a fundamental upgrade

Units of measure are getting a fundamental upgrade | Amazing Science |

If scientists had sacred objects, this would be one of them: a single, closely guarded 137-year-old cylinder of metal, housed in a vault outside of Paris. It is a prototype that precisely defines a kilogram of mass everywhere in the universe. A kilogram of ground beef at the grocery store has the same mass as this one special hunk of metal, an alloy of platinum and iridium. A 60-kilogram woman has 60 times as much mass. Even far-flung astronomical objects such as comets are measured relative to this all-important cylinder: Comet 67P/Churyumov–Gerasimenko, which was recently visited by the European Space Agency’s Rosetta spacecraft (SN: 2/21/15, p. 6), has a mass of about 10 trillion such cylinders.


But there’s nothing special about that piece of metal, and its mass isn’t even perfectly constant — scratches or gunk collecting on its surface could change its size subtly (SN: 11/20/10, p. 12). And then a kilogram of beef would be slightly more or less meat than it was before. That difference would be too small to matter when flipping burgers, but for precise scientific measurements, a tiny shift in the definition of the kilogram could cause big problems.


That issue nags at some researchers. They would prefer to define important units — including kilograms, meters and seconds — using immutable properties of nature, rather than arbitrary lengths, masses and other quantities dreamed up by scientists. If humans were to make contact with aliens and compare systems of units, says physicist Stephan Schlamminger, “we’d be the laughingstock of the galaxy.”


To set things right, metrologists — a rare breed of scientist obsessed with precise measurements — are revamping the system. Soon, they will use fundamental constants of nature — unchanging numbers such as the speed of light, the charge of an electron and the quantum mechanical Planck constant — to calibrate their rulers, scales and thermometers. They’ve already gotten rid of an artificial standard that used to define the meter — an engraved platinum-iridium bar. In 2018, they plan to jettison the Parisian kilogram cylinder, too.
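The logic of pinning units to constants can be sketched in a few lines. The meter is already defined by fixing the speed of light; the plan for the kilogram is analogous, fixing the Planck constant so that mass maps onto a photon frequency via E = h·f and E = m·c² (the Planck-constant value shown is the one eventually fixed exactly in the revision, which had not yet been finalized when this article appeared):

```python
# Units from constants: fix the numerical values, derive the units.
C = 299_792_458          # speed of light, m/s (exact by definition)
H = 6.62607015e-34       # Planck constant, J*s (the value later fixed exactly)

meter_travel_time = 1 / C      # 1 m = distance light travels in this many seconds
freq_per_kg = C**2 / H         # frequency equivalent of 1 kg of mass, ~1.36e50 Hz

print(f"light crosses 1 m in {meter_travel_time:.3e} s")
print(f"1 kg corresponds to a total photon frequency of {freq_per_kg:.3e} Hz")
```

A Kibble (watt) balance effectively realizes this mass-to-frequency link electrically, so no artifact cylinder is needed.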

Via Kim Frye
Scooped by Dr. Stefan Gruenwald!

Graphene's sleeping superconductivity awakens

Graphene's sleeping superconductivity awakens | Amazing Science |
Researchers have found a way to trigger the innate, but previously hidden, ability of graphene to act as a superconductor - meaning that it can be made to carry an electrical current with zero resistance.


The finding, reported in Nature Communications, further enhances the potential of graphene, which is already widely seen as a material that could revolutionise industries such as healthcare and electronics. Graphene is a two-dimensional sheet of carbon atoms and combines several remarkable properties; for example, it is very strong, but also light and flexible, and highly conductive.


Since its discovery in 2004, scientists have speculated that graphene may also have the capacity to be a superconductor. Until now, superconductivity in graphene has only been achieved by doping it with, or by placing it on, a superconducting material - a process which can compromise some of its other properties. But in the new study, researchers at the University of Cambridge managed to activate the dormant potential for graphene to superconduct in its own right. This was achieved by coupling it with a material called praseodymium cerium copper oxide (PCCO).


Superconductors are already used in numerous applications. Because they generate large magnetic fields they are an essential component in MRI scanners and levitating trains. They could also be used to make energy-efficient power lines and devices capable of storing energy for millions of years.


Superconducting graphene opens up yet more possibilities. The researchers suggest, for example, that graphene could now be used to create new types of superconducting quantum devices for high-speed computing. Intriguingly, it might also be used to prove the existence of a mysterious form of superconductivity known as "p-wave" superconductivity, which academics have been struggling to verify for more than 20 years.

Scooped by Dr. Stefan Gruenwald!

Theory provides roadmap in quest for quark soup 'critical point'

Theory provides roadmap in quest for quark soup 'critical point' | Amazing Science |

Thanks to a new development in nuclear physics theory, scientists exploring expanding fireballs that mimic the early universe have new signs to look for as they map out the transition from primordial plasma to matter as we know it. The theory work, described in a paper recently published as an Editor's Suggestion in Physical Review Letters (PRL), identifies key patterns that would be proof of the existence of a so-called "critical point" in the transition among different phases of nuclear matter. Like the freezing and boiling points that delineate various phases of water -- liquid, solid ice, and steam -- the points nuclear physicists seek to identify will help them understand fundamental properties of the fabric of our universe.


Nuclear physicists create the fireballs by colliding ordinary nuclei -- made of protons and neutrons -- in an "atom smasher" called the Relativistic Heavy Ion Collider (RHIC), a U.S. Department of Energy Office of Science User Facility at Brookhaven National Laboratory. The subatomic smashups generate temperatures measuring trillions of degrees, hot enough to "melt" the protons and neutrons and release their inner building blocks -- quarks and gluons. The collider essentially turns back the clock to recreate the "quark-gluon plasma" (QGP) that existed just after the Big Bang. By tracking the particles that emerge from the fireballs, scientists can learn about nuclear phase transitions -- both the melting and how the quarks and gluons "freeze out" as they did at the dawn of time to form the visible matter of today's world.


"We want to understand the properties of QGP," said nuclear theorist Raju Venugopalan, one of the authors on the new paper. "We don't know how those properties might be used, but 100 years ago, we didn't know how we'd use the collective properties of electrons, which now form the basis of almost all of our technologies. Back then, electrons were just as exotic as the quarks and gluons are now."

Scooped by Dr. Stefan Gruenwald!

Is the universe fine-tuned for life to evolve?

Is the universe fine-tuned for life to evolve? | Amazing Science |
Geraint F. Lewis’ day job involves creating synthetic universes on supercomputers. They can be overwhelmingly bizarre, unstable places. The question that compels him is: how did our universe come to be so perfectly tuned for stability and life?

For more than 400 years, physicists treated the universe like a machine, taking it apart to see how it ticks. The surprise is it turns out to have remarkably few parts: just leptons and quarks and four fundamental forces to glue them together.

But those few parts are exquisitely machined. If we tinker with their settings, even slightly, the universe as we know it would cease to exist. Science now faces the question of why the universe appears to have been “fine-tuned” to allow the appearance of complex life, a question that has some potentially uncomfortable answers.


Scooped by Dr. Stefan Gruenwald!

The center of Earth is younger than the outer surface

The center of Earth is younger than the outer surface | Amazing Science |
Einstein’s general theory of relativity predicts the center of the Earth is two years younger than the crust.


Our home planet is young at heart. According to new calculations, Earth’s center is more than two years younger than its surface. In Einstein’s general theory of relativity, massive objects warp the fabric of spacetime, creating a gravitational pull and slowing time nearby. So a clock placed at Earth’s center will tick ever-so-slightly slower than a clock at its surface. Such time shifts are determined by the gravitational potential, a measure of the amount of work it would take to move an object from one place to another. Since climbing up from Earth’s center would be a struggle against gravity, clocks down deep would run slow relative to surface timepieces.


Over the 4.5 billion years of Earth’s history, the gradual shaving off of fractions of a second adds up to a core that’s 2.5 years younger than the planet’s crust, researchers estimate in the May European Journal of Physics. Theoretical physicist Richard Feynman had suggested in the 1960s that the core was younger, but only by a few days.
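The order of magnitude is easy to reproduce with a toy model (a simplification, not the paper's calculation): for a uniform-density sphere, the potential difference between surface and center is GM/(2R), giving an age gap of about 1.6 years; because the real Earth concentrates mass in a dense core, the central potential is deeper and the figure grows toward the quoted 2.5 years.

```python
# Toy estimate of the core-vs-crust age gap from gravitational time
# dilation, assuming a uniform-density Earth. Clocks at the center run
# slow by a fraction of roughly delta_phi / c^2, with delta_phi = G*M/(2*R).
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
M = 5.972e24         # mass of Earth, kg
R = 6.371e6          # radius of Earth, m
C = 2.998e8          # speed of light, m/s
AGE = 4.5e9          # age of Earth, years

frac_shift = G * M / (2 * R * C**2)   # ~3.5e-10
age_gap_years = frac_shift * AGE      # ~1.6 years in this simplified model

print(f"fractional clock shift: {frac_shift:.2e}")
print(f"core is younger by ~{age_gap_years:.1f} years (uniform-density model)")
```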

Scooped by Dr. Stefan Gruenwald!

Fine-Tuning of Our Universe: Are We Special?

Fine-Tuning of Our Universe: Are We Special? | Amazing Science |
Why do the deep physical laws of our universe seem just right for our existence? What does a "fine-tuned" universe mean? What is the far future of intelligence, human or alien, in the universe?
Rescooped by Dr. Stefan Gruenwald from Fragments of Science!

Exotic insulator may hold clue to key mystery of modern physics

Exotic insulator may hold clue to key mystery of modern physics | Amazing Science |

Experiments using laser light and pieces of gray material the size of fingernail clippings may offer clues to a fundamental scientific riddle: what is the relationship between the everyday world of classical physics and the hidden quantum realm that obeys entirely different rules?

"We found a particular material that is straddling these two regimes," said N. Peter Armitage, an associate professor of physics at Johns Hopkins University who led the research for the paper just published in the journal Science. Six scientists from Johns Hopkins and Rutgers University were involved in the work on materials called topological insulators, which can conduct electricity on their atom-thin surfaces but not in their interiors.


Topological insulators were predicted in the 1980s, first observed in 2007 and have been studied intensively since. Made from any of a number of elements, these materials have the capacity to show quantum properties that usually appear only at the microscopic level, but here appear in a material visible to the naked eye.


The experiments reported in Science establish these materials as a distinct state of matter "that exhibits macroscopic quantum mechanical effects," Armitage said. "Usually we think of quantum mechanics as a theory of small things, but in this system quantum mechanics is appearing on macroscopic length scales. The experiments are made possible by unique instrumentation developed in my laboratory."


In the experiments reported in Science, dark gray material samples made of the elements bismuth and selenium – each a few millimeters long and of different thicknesses – were hit with terahertz ("THz") light beams, which are invisible to the unaided eye. Researchers measured the reflected light as it moved through the material samples, and found fingerprints of a quantum state of matter.

Via Mariaschnee
Scooped by Dr. Stefan Gruenwald!

Researchers Developed World's First Water-Wave Laser

Researchers Developed World's First Water-Wave Laser | Amazing Science |

Technion researchers have demonstrated, for the first time, that laser emissions can be created through the interaction of light and water waves. This “water-wave laser” could someday be used in tiny sensors that combine light waves, sound and water waves, or as a feature on microfluidic “lab-on-a-chip” devices used to study cell biology and to test new drug therapies.


For now, the water-wave laser offers a “playground” for scientists studying the interaction of light and fluid at a scale smaller than the width of a human hair, the researchers write in the new report, published November 21 in the journal Nature Photonics.

The study was conducted by Technion-Israel Institute of Technology students Shmuel Kaminski, Leopoldo Martin, and Shai Maayani, under the supervision of Professor Tal Carmon, head of the Optomechanics Center at the Mechanical Engineering Faculty at Technion. Carmon said the study is the first bridge between two areas of research that were previously considered unrelated to one another: nonlinear optics and water waves.


A typical laser can be created when the electrons in atoms become “excited” by energy absorbed from an outside source, causing them to emit radiation in the form of laser light. Professor Carmon and his colleagues now show for the first time that water wave oscillations within a liquid device can also generate laser radiation.


The possibility of creating a laser through the interaction of light with water waves had not been examined until now, Carmon said, mainly due to the huge difference between the low frequency of water waves on the surface of a liquid (approximately 1,000 oscillations per second) and the high frequency of light wave oscillations (10^14 oscillations per second). This frequency difference reduces the efficiency of the energy transfer between light and water waves, which is needed to produce the laser emission.


To compensate for this low efficiency, the researchers created a device in which an optical fiber delivers light into a tiny droplet of octane and water. Light waves and water waves pass through each other many times (approximately one million times) inside the droplet, generating the energy that leaves the droplet as the emission of the water-wave laser.
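The numbers behind that compensation argument can be sketched roughly (the ~100-micron droplet size is an assumed, illustrative value, not from the report):

```python
# The optical-to-water-wave frequency mismatch is ~11 orders of magnitude;
# the droplet resonator offsets the resulting inefficiency by letting the
# light make ~1e6 round trips, stretching the effective interaction length.
f_light = 1e14        # optical frequency, Hz (order of magnitude, from the report)
f_wave = 1e3          # water-wave frequency, Hz (from the report)
passes = 1e6          # round trips inside the droplet (from the report)
droplet_size = 1e-4   # assumed ~100-micron droplet, m (illustrative)

mismatch = f_light / f_wave             # ~1e11
effective_path = passes * droplet_size  # ~100 m of light-droplet interaction

print(f"frequency mismatch: {mismatch:.0e}")
print(f"effective interaction length: {effective_path:.0f} m")
```

Under these assumptions, a droplet a tenth of a millimeter across behaves like an interaction region on the order of a hundred meters long.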


The interaction between the fiber-optic light and the minuscule vibrations on the surface of the droplet is like an echo, the researchers noted, in which the interaction of sound waves with the surfaces they bounce off can make a single scream audible several times. To increase this echo effect in their device, the researchers used highly transparent, runny liquids to encourage interactions between the light and the droplet.

Carlos Garcia Pando's comment, December 9, 2016 6:14 AM
Looks very interesting as optomechanics amplifier
Rescooped by Dr. Stefan Gruenwald from Conformable Contacts!

Theory that challenges Einstein’s physics could soon be put to the test

Theory that challenges Einstein’s physics could soon be put to the test | Amazing Science |
Scientists behind a theory that the speed of light is variable – and not constant as Einstein suggested – have made a prediction that could be tested.


Einstein observed that the speed of light remains the same in any situation, and this meant that space and time could be different in different situations. The assumption that the speed of light is constant, and always has been, underpins many theories in physics, such as Einstein’s theory of general relativity. In particular, it plays a role in models of what happened in the very early universe, seconds after the Big Bang.


But some researchers have suggested that the speed of light could have been much higher in this early universe. Now, one of this theory’s originators, Professor João Magueijo from Imperial College London, working with Dr Niayesh Afshordi at the Perimeter Institute in Canada, has made a prediction that could be used to test the theory’s validity.


Structures in the universe, for example galaxies, all formed from fluctuations in the early universe – tiny differences in density from one region to another. A record of these early fluctuations is imprinted on the cosmic microwave background – a map of the oldest light in the universe – in the form of a ‘spectral index’.


Working with their theory that the fluctuations were influenced by a varying speed of light in the early universe, Professor Magueijo and Dr Afshordi have now used a model to put an exact figure on the spectral index. The predicted figure and the model it is based on are published in the journal Physical Review D.

Via YEC Geo
YEC Geo's curator insight, November 27, 2016 10:37 AM
Really interesting--hadn't heard of this before.
Scooped by Dr. Stefan Gruenwald!

Alien life could be so advanced it becomes indistinguishable from physics

Alien life could be so advanced it becomes indistinguishable from physics | Amazing Science |

Caleb Scharf, an astrophysicist at Columbia University, recently published an article that posits alien life may be so far advanced that we cannot tell it apart from the laws of physics. Is it possible that the universe is potentially teeming with intelligence, it’s just so far ahead of us that it has literally become part of the fabric of space and time? Scharf provides a few ideas on how this might be the case, so let’s unpack this whopper of a speculative hypothesis. He begins with a logical first step, the machine singularity.


Assuming a civilization survives long enough, it may become possible to bridge the gap between a biological brain and an artificial machine brain. Once an intelligence becomes one with a computer, and the computational power of the brain exponentially increases, the brain/computer could reach a ‘singularity’, a point at which the entire understanding of the universe becomes child’s play. While this is purely speculative, Scharf points out that some workings of the universe are so strange that the singularity hypothesis might fit the bill. For example, only 5% of the universe’s mass-energy consists of ordinary matter; a larger chunk, about 27%, is spooky dark matter that exists only in models and math, yet remains imperceptible to us.

Scooped by Dr. Stefan Gruenwald!

Controlling Electrons in Space and Time With Extremely High Accuracy

Controlling Electrons in Space and Time With Extremely High Accuracy | Amazing Science |

Sharp metal needles can be used to emit electrons. A quantum effect opens up new possibilities of controlling electron emission with extremely high accuracy.


In an electron microscope, electrons are emitted by pointed metal tips, so that they can be steered and controlled with high precision. Recently, such metal tips have also been used as high-precision electron sources for generating x-rays. A team of researchers at TU Wien (Vienna), together with colleagues from FAU Erlangen-Nürnberg (Germany), has developed a method of controlling electron emission with higher precision than ever before. With the help of two different laser pulses it is now possible to switch the flow of electrons on and off on extremely short time scales.

Scooped by Dr. Stefan Gruenwald!

First observation of how quantum superposition builds up in a helium atom within femtoseconds

First observation of how quantum superposition builds up in a helium atom within femtoseconds | Amazing Science |

In the double slit experiment, a particle travels on two different paths at the same time. Something similar can be observed when a helium atom is ionized with a laser beam. The ionization of helium can happen via two different processes, and this leads to characteristic interference effects. A team of scientists has now managed to observe the buildup of these effects—even though it takes place on a time scale of femtoseconds.


It is definitely the most famous experiment in quantum physics: in the double slit experiment, a particle is fired onto a plate with two parallel slits, so there are two different paths on which the particle can reach the detector on the other side. Due to its quantum properties, the particle does not have to choose between these two possibilities, it can pass through both slits at the same time. Something quite similar can be observed when a helium atom is ionized with a laser beam.


Just like the two paths through the plate, the ionization of helium can happen via two different processes at the same time, and this leads to characteristic interference effects. In the case of the helium atom, they are called "Fano resonances". A team of scientists from TU Wien (Vienna, Austria), the Max-Planck Institute for Nuclear Physics in Heidelberg (Germany) and Kansas State University (USA) has now managed to observe the buildup of these Fano resonances—even though this effect takes place on a time scale of femtoseconds.


The experiment was performed in Heidelberg, the original proposal for such an experiment and computer simulations were developed by the team from Vienna, additional theoretical calculations came from Kansas State University. The study is published in Science. In the same issue of Science magazine, a team of scientists from France and Spain has published another paper, in which a complementary method of time-resolved photoelectron spectroscopy is used to obtain a view on the Fano resonance.

Scooped by Dr. Stefan Gruenwald!

CERN experiment improves precision of antiproton mass measurement with new innovative cooling technique

CERN experiment improves precision of antiproton mass measurement with new innovative cooling technique | Amazing Science |

Exotic molecule tests fundamental symmetry


Spectroscopy of exotic molecules can offer insight into fundamental physics. Hori et al. studied the transition frequencies of an unusual helium atom in which one of the two electrons was substituted by an antiproton, the negatively charged antiparticle partner of the proton. The antiprotonic helium was cooled down to low temperatures to allow the frequencies to be measured with high precision. The extracted mass of the antiproton (relative to the electron mass) was in good agreement with previous measurements of the proton mass. This finding is in keeping with the implications of the combined charge, parity, and time-reversal symmetry of physical laws.


Charge, parity, and time reversal (CPT) symmetry implies that a particle and its antiparticle have the same mass. The antiproton-to-electron mass ratio can be precisely determined from the single-photon transition frequencies of antiprotonic helium.


The physicists measured 13 such frequencies with laser spectroscopy to a fractional precision of 2.5 × 10^−9 to 16 × 10^−9. About 2 × 10^9 antiprotonic helium atoms were cooled to temperatures between 1.5 and 1.7 kelvin by using buffer-gas cooling in cryogenic low-pressure helium gas; the narrow thermal distribution led to the observation of sharp spectral lines of small thermal Doppler width. The deviation between the experimental frequencies and the results of three-body quantum electrodynamics calculations was reduced by a factor of 1.4 to 10 compared with previous single-photon experiments.


From this, the antiproton-to-electron mass ratio was determined to be 1836.1526734(15), which agrees with a recent experimental value of the proton-to-electron mass ratio to within 8 × 10^−10.
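As an illustrative consistency check (the reference proton-to-electron ratio below is an assumption taken from CODATA 2014; the paper does not say which value it compared against):

```python
# CPT symmetry predicts the antiproton and proton have identical mass, so
# the two mass ratios should agree. Compare the measured antiproton ratio
# against an assumed reference proton ratio.
pbar_over_e = 1836.1526734    # antiproton-to-electron ratio from the measurement
p_over_e = 1836.15267389      # proton-to-electron ratio, CODATA 2014 (assumed)

frac_dev = abs(p_over_e - pbar_over_e) / p_over_e
print(f"fractional deviation: {frac_dev:.1e}")  # comfortably below 8e-10
```

With these values the deviation is a few parts in 10^10, consistent with the agreement level quoted above.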
